diff --git a/.github/workflows/python-syntax-check.yml b/.github/workflows/python-syntax-check.yml
new file mode 100644
index 00000000..34527de9
--- /dev/null
+++ b/.github/workflows/python-syntax-check.yml
@@ -0,0 +1,51 @@
+name: Python Syntax Check
+
+on:
+  pull_request:
+    branches:
+      - main
+      - Development
+    paths:
+      - 'application/single_app/**.py'
+      - '.github/workflows/python-syntax-check.yml'
+
+jobs:
+  syntax-check:
+    runs-on: ubuntu-latest
+
+    steps:
+      - name: Checkout code
+        uses: actions/checkout@v4
+
+      - name: Set up Python
+        uses: actions/setup-python@v5
+        with:
+          python-version: '3.11'
+
+      - name: Run Python compilation check
+        run: |
+          cd application/single_app
+          echo "🔍 Running Python compilation checks on all .py files..."
+          failed_files=()
+
+          for file in *.py; do
+            echo ""
+            echo "=== Compiling $file ==="
+            if python -m py_compile "$file" 2>&1; then
+              echo "✓ $file - OK"
+            else
+              echo "✗ $file - FAILED"
+              failed_files+=("$file")
+            fi
+          done
+
+          echo ""
+          echo "================================"
+          if [ ${#failed_files[@]} -eq 0 ]; then
+            echo "✅ All Python files compiled successfully!"
+            exit 0
+          else
+            echo "❌ ${#failed_files[@]} file(s) failed compilation:"
+            printf '  - %s\n' "${failed_files[@]}"
+            exit 1
+          fi
diff --git a/README.md b/README.md
index 7f19f5ff..3f5dcb76 100644
--- a/README.md
+++ b/README.md
@@ -14,10 +14,111 @@ The application utilizes **Azure Cosmos DB** for storing conversations, metadata
 
 ## Quick Deploy
 
-Use azd up [MORE DETAILS TO COME]
+[Detailed Deployment Guide](./deployers/bicep/README.md)
 
+### Pre-Configuration:
+
+The following procedure must be completed by a user that has permissions to create an application registration in the user's Entra tenant.
+
+#### Create the application registration:
+
+```powershell
+cd ./deployers
+```
+
+Define your application name and your environment:
+
+```
+appName = 
+```
+
+```
+environment = 
+```
+
+The following script will create an Entra Enterprise Application, with an App Registration named `<appName>-<environment>-ar`, for the web service called `<appName>-<environment>-app`.
+
+> [!TIP]
+>
+> The web service name may be overridden with the `-AppServceName` parameter.
+
+> [!TIP]
+>
+> A different expiration date for the secret may be set with the `-SecretExpirationDays` parameter; the default is 180 days.
+
+```powershell
+.\Initialize-EntraApplication.ps1 -AppName "" -Environment "" -AppRolesJsonPath "./azurecli/appRegistrationRoles.json"
 ```
-azd up
+
+> [!NOTE]
+>
+> Be sure to save this information as it will not be available after the window is closed.
+
+```
+========================================
+App Registration Created Successfully!
+Application Name: 
+Client ID: 
+Tenant ID: 
+Service Principal ID: 
+Client Secret: 
+Secret Expiration: 
+```
+
+In addition, the script will note the steps that must still be taken before the app registration is complete.
+
+1. Grant Admin Consent for API Permissions:
+
+   - Navigate to Azure Portal > Entra ID > App registrations
+   - Find app: `<Application Name>`
+   - Go to API permissions
+   - Click 'Grant admin consent for [Tenant]'
+
+2. Assign Users/Groups to Enterprise Application:
+   - Navigate to Azure Portal > Entra ID > Enterprise applications
+   - Find app: `<Application Name>`
+   - Go to Users and groups
+   - Add user/group assignments with appropriate app roles
+
+3. Store the Client Secret Securely:
+   - Save the client secret in Azure Key Vault or another secure credential store (see the sketch below)
+   - The secret value is shown above and will not be displayed again
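The snippet below is a minimal sketch of step 3, assuming you use the Azure CLI and already have a Key Vault you can write secrets to (for example, with the Key Vault Secrets Officer role). The vault name and secret name are placeholders, not values produced by the deployment scripts.

```powershell
# Hypothetical vault and secret names: substitute your own values.
az keyvault secret set `
  --vault-name "<your-key-vault>" `
  --name "<appName>-<environment>-client-secret" `
  --value "<client secret from the script output>"
```

You can read the value back later with `az keyvault secret show --vault-name "<your-key-vault>" --name "<appName>-<environment>-client-secret"`.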
+
+#### Configure AZD Environment
+
+Using the terminal in Visual Studio Code, change to the deployers directory:
+
+```powershell
+cd ./deployers
+```
+
+If you work with other Azure clouds, you may need to change the cloud setting, for example `azd config set cloud.name AzureUSGovernment`; see [Use Azure Developer CLI in sovereign clouds | Microsoft Learn](https://learn.microsoft.com/en-us/azure/developer/azure-developer-cli/sovereign-clouds).
+
+```powershell
+azd config set cloud.name AzureCloud
+```
+
+The next command opens a browser window where a user with Owner-level permissions on the target subscription must authenticate.
+
+```powershell
+azd auth login
+```
+
+Use the same value for the environment name that was used in the application registration.
+
+```powershell
+azd env new
+```
+
+Select the new environment:
+
+```powershell
+azd env select
+```
+
+This step will begin the deployment process.
+
+```powershell
+azd up
 ```
 
 ## Architecture
@@ -27,50 +128,27 @@ azd up
 ## Features
 
 - **Chat with AI**: Interact with an AI model based on Azure OpenAI’s GPT and Thinking models.
-
 - **RAG with Hybrid Search**: Upload documents and perform hybrid searches (vector + keyword), retrieving relevant information from your files to augment AI responses.
-
 - **Document Management**: Upload, store, and manage multiple versions of documents—personal ("Your Workspace") or group-level ("Group Workspaces").
-
 - **Group Management**: Create and join groups to share access to group-specific documents, enabling collaboration with Role-Based Access Control (RBAC).
-
 - **Ephemeral (Single-Convo) Documents**: Upload temporary documents available only during the current chat session, without persistent storage in Azure AI Search.
-
 - **Conversation Archiving (Optional)**: Retain copies of user conversations—even after deletion from the UI—in a dedicated Cosmos DB container for audit, compliance, or legal requirements.
-
 - **Content Safety (Optional)**: Integrate Azure AI Content Safety to review every user message *before* it reaches AI models, search indexes, or image generation services. Enforce custom filters and compliance policies, with an optional `SafetyAdmin` role for viewing violations.
-
 - **Feedback System (Optional)**: Allow users to rate AI responses (thumbs up/down) and provide contextual comments on negative feedback. Includes user and admin dashboards, governed by an optional `FeedbackAdmin` role.
-
 - **Bing Web Search (Optional)**: Augment AI responses with live Bing search results, providing up-to-date information. Configurable via Admin Settings.
-
 - **Image Generation (Optional)**: Enable on-demand image creation using Azure OpenAI's DALL-E models, controlled via Admin Settings.
-
 - **Video Extraction (Optional)**: Utilize Azure Video Indexer to transcribe speech and perform Optical Character Recognition (OCR) on video frames. Segments are timestamp-chunked for precise retrieval and enhanced citations linking back to the video timecode.
-
 - **Audio Extraction (Optional)**: Leverage Azure Speech Service to transcribe audio files into timestamped text chunks, making audio content searchable and enabling enhanced citations linked to audio timecodes.
-
 - **Document Classification (Optional)**: Admins define custom classification types and associated colors.
Users tag uploaded documents with these labels, which flow through to AI conversations, providing lineage and insight into data sensitivity or type. - - **Enhanced Citation (Optional)**: Store processed, chunked files in Azure Storage (organized into user- and document-scoped folders). Display interactive citations in the UI—showing page numbers or timestamps—that link directly to the source document preview. - - **Metadata Extraction (Optional)**: Apply an AI model (configurable GPT model via Admin Settings) to automatically generate keywords, two-sentence summaries, and infer author/date for uploaded documents. Allows manual override for richer search context. - - **File Processing Logs (Optional)**: Enable verbose logging for all ingestion pipelines (workspaces and ephemeral chat uploads) to aid in debugging, monitoring, and auditing file processing steps. - - **Redis Cache (Optional)**: Integrate Azure Cache for Redis to provide a distributed, high-performance session store. This enables true horizontal scaling and high availability by decoupling user sessions from individual app instances. - - **Authentication & RBAC**: Secure access via Azure Active Directory (Entra ID) using MSAL. Supports Managed Identities for Azure service authentication, group-based controls, and custom application roles (`Admin`, `User`, `CreateGroup`, `SafetyAdmin`, `FeedbackAdmin`). - - **Supported File Types**: - - Text: `txt`, `md`, `html`, `json` - - * Documents: `pdf`, `docx`, `pptx`, `xlsx`, `xlsm`, `xls`, `csv` - * Images: `jpg`, `jpeg`, `png`, `bmp`, `tiff`, `tif`, `heif` - * Video: `mp4`, `mov`, `avi`, `wmv`, `mkv`, `webm` - * Audio: `mp3`, `wav`, `ogg`, `aac`, `flac`, `m4a` - -## Demos - -ADD DEMOS HERE \ No newline at end of file + - **Text**: `txt`, `md`, `html`, `json`, `xml`, `yaml`, `yml`, `log` + - **Documents**: `pdf`, `doc`, `docm`, `docx`, `pptx`, `xlsx`, `xlsm`, `xls`, `csv` + - **Images**: `jpg`, `jpeg`, `png`, `bmp`, `tiff`, `tif`, `heif` + - **Video**: `mp4`, `mov`, `avi`, `wmv`, `mkv`, `flv`, `mxf`, `gxf`, `ts`, `ps`, `3gp`, `3gpp`, `mpg`, `asf`, `m4v`, `isma`, `ismv`, `dvr-ms` + - **Audio**: `wav`, `m4a` \ No newline at end of file diff --git a/application/single_app/agent_logging_chat_completion_backup.py b/application/single_app/agent_logging_chat_completion_backup.py deleted file mode 100644 index 96afacf8..00000000 --- a/application/single_app/agent_logging_chat_completion_backup.py +++ /dev/null @@ -1,481 +0,0 @@ - - -import json -from pydantic import Field -from semantic_kernel.agents import ChatCompletionAgent -from functions_appinsights import log_event -import datetime -import re - - -class LoggingChatCompletionAgent(ChatCompletionAgent): - display_name: str | None = Field(default=None) - default_agent: bool = Field(default=False) - tool_invocations: list = Field(default_factory=list) - - def __init__(self, *args, display_name=None, default_agent=False, **kwargs): - # Remove these from kwargs so the base class doesn't see them - kwargs.pop('display_name', None) - kwargs.pop('default_agent', None) - super().__init__(*args, **kwargs) - self.display_name = display_name - self.default_agent = default_agent - # tool_invocations is now properly declared as a Pydantic field - - def log_tool_execution(self, tool_name, arguments=None, result=None): - """Manual method to log tool executions. 
Can be called by plugins.""" - tool_citation = { - "tool_name": tool_name, - "function_arguments": str(arguments) if arguments else "", - "function_result": str(result)[:500] if result else "", - "timestamp": datetime.datetime.utcnow().isoformat() - } - self.tool_invocations.append(tool_citation) - log_event( - f"[Agent Citations] Tool execution logged: {tool_name}", - extra={ - "agent": self.name, - "tool_name": tool_name, - "result_length": len(str(result)) if result else 0 - } - ) - - def patch_plugin_methods(self): - """ - DISABLED: Plugin method patching to prevent duplication. - Plugin logging is now handled by the @plugin_function_logger decorator system. - Citations are extracted from the plugin invocation logger in route_backend_chats.py. - """ - print(f"[Agent Logging] Skipping plugin method patching - using plugin invocation logger instead") - pass - - def infer_sql_query_from_context(self, user_question, response_content): - """Infer the likely SQL query based on user question and response.""" - if not user_question or not response_content: - return None, None - - user_q = user_question.lower() - response = response_content.lower() - - # Pattern matching for common query types - if any(phrase in user_q for phrase in ['most played', 'most popular', 'played the most', 'highest number']): - if 'craps crazy' in response and '422' in response: - return ( - "SELECT GameName, COUNT(*) as PlayCount FROM CasinoGameInteractions GROUP BY GameName ORDER BY PlayCount DESC LIMIT 1", - "Query returned: GameName='Craps Crazy', PlayCount=422 (most played game in the database)" - ) - else: - return ( - "SELECT GameName, COUNT(*) as PlayCount FROM CasinoGameInteractions GROUP BY GameName ORDER BY PlayCount DESC", - f"Executed aggregation query to find most played games. Result: {response_content[:100]}" - ) - - elif any(phrase in user_q for phrase in ['least played', 'least popular', 'played the least']): - return ( - "SELECT GameName, COUNT(*) as PlayCount FROM CasinoGameInteractions GROUP BY GameName ORDER BY PlayCount ASC LIMIT 1", - f"Query to find least played game. Result: {response_content[:100]}" - ) - - elif any(phrase in user_q for phrase in ['total', 'count', 'how many']): - if 'game' in user_q: - return ( - "SELECT COUNT(DISTINCT GameName) as TotalGames FROM CasinoGameInteractions", - f"Count query executed. Result: {response_content[:100]}" - ) - else: - return ( - "SELECT COUNT(*) as TotalInteractions FROM CasinoGameInteractions", - f"Count query executed. Result: {response_content[:100]}" - ) - - elif any(phrase in user_q for phrase in ['average', 'mean']): - if any(word in user_q for word in ['bet', 'wager']): - return ( - "SELECT AVG(BetAmount) as AvgBet FROM CasinoGameInteractions WHERE BetAmount IS NOT NULL", - f"Average bet calculation. Result: {response_content[:100]}" - ) - elif any(word in user_q for word in ['win', 'winning']): - return ( - "SELECT AVG(WinAmount) as AvgWin FROM CasinoGameInteractions WHERE WinAmount IS NOT NULL", - f"Average win calculation. Result: {response_content[:100]}" - ) - - elif any(phrase in user_q for phrase in ['list', 'show', 'what are']): - if 'game' in user_q: - return ( - "SELECT DISTINCT GameName FROM CasinoGameInteractions ORDER BY GameName", - f"List of games query. Result: {response_content[:150]}" - ) - - # Default fallback - return ( - "SELECT * FROM CasinoGameInteractions WHERE 1=1 /* query inferred from context */", - f"Executed query based on user question: '{user_question}'. 
Result: {response_content[:100]}" - ) - - def extract_tool_invocations_from_history(self, chat_history): - """Extract tool invocations from chat history for citations.""" - tool_citations = [] - - if not chat_history: - return tool_citations - - try: - # Iterate through chat history to find function calls and responses - for message in chat_history: - # Check if message has function calls in various formats - if hasattr(message, 'items') and message.items: - for item in message.items: - # Look for function call content (standard SK format) - if hasattr(item, 'function_name') and hasattr(item, 'function_result'): - tool_citation = { - "tool_name": item.function_name, - "function_arguments": str(getattr(item, 'arguments', {})), - "function_result": str(item.function_result)[:500], # Limit result size - "timestamp": datetime.datetime.utcnow().isoformat() - } - tool_citations.append(tool_citation) - # Alternative: Check for function call in content - elif hasattr(item, 'function_call'): - func_call = item.function_call - tool_citation = { - "tool_name": getattr(func_call, 'name', 'unknown'), - "function_arguments": str(getattr(func_call, 'arguments', {})), - "function_result": "Function called", - "timestamp": datetime.datetime.utcnow().isoformat() - } - tool_citations.append(tool_citation) - # Check for function result content type - elif hasattr(item, 'content_type') and item.content_type == 'function_result': - tool_citation = { - "tool_name": getattr(item, 'name', 'unknown_function'), - "function_arguments": "", - "function_result": str(getattr(item, 'text', ''))[:500], - "timestamp": datetime.datetime.utcnow().isoformat() - } - tool_citations.append(tool_citation) - - # Check for function calls in message metadata or inner content - if hasattr(message, 'metadata') and message.metadata: - # Look for function call metadata - for key, value in message.metadata.items(): - if 'function' in key.lower() or 'tool' in key.lower(): - tool_citation = { - "tool_name": f"metadata_{key}", - "function_arguments": "", - "function_result": str(value)[:500], - "timestamp": datetime.datetime.utcnow().isoformat() - } - tool_citations.append(tool_citation) - - # Check message role for tool/function messages - if hasattr(message, 'role') and hasattr(message, 'name'): - if message.role.value in ['tool', 'function']: - tool_citation = { - "tool_name": message.name or 'unknown_tool', - "function_arguments": "", - "function_result": str(getattr(message, 'content', ''))[:500], - "timestamp": datetime.datetime.utcnow().isoformat() - } - tool_citations.append(tool_citation) - - # Check for tool content in message content - if hasattr(message, 'content') and isinstance(message.content, str): - # Look for tool execution patterns in content - if "function_name:" in message.content or "tool_name:" in message.content: - # Extract tool information from content - tool_citation = { - "tool_name": "extracted_from_content", - "function_arguments": "", - "function_result": message.content[:500], - "timestamp": datetime.datetime.utcnow().isoformat() - } - tool_citations.append(tool_citation) - - except Exception as e: - log_event( - "[Agent Citations] Error extracting tool invocations from chat history", - extra={"agent": self.name, "error": str(e)}, - level="WARNING" - ) - - return tool_citations - - async def invoke(self, *args, **kwargs): - # Clear previous tool invocations - self.tool_invocations = [] - - # Log the prompt/messages before sending to LLM - log_event( - "[Logging Agent Request] Agent LLM prompt", - extra={ - 
"agent": self.name, - "prompt": [m.content[:30] for m in args[0]] if args else None - } - ) - - print(f"[Logging Agent Request] Agent: {self.name}") - print(f"[Logging Agent Request] Prompt: {[m.content[:30] for m in args[0]] if args else None}") - - # Store user question context for better tool detection - if args and args[0] and hasattr(args[0][-1], 'content'): - self._user_question = args[0][-1].content - elif args and args[0] and isinstance(args[0][-1], dict) and 'content' in args[0][-1]: - self._user_question = args[0][-1]['content'] - - # Apply patching to capture function calls - try: - self.patch_plugin_methods() - except Exception as e: - log_event(f"[Agent Citations] Error applying plugin patches: {e}", level="WARNING") - - response = None - try: - # Store initial message count to detect new messages from tool usage - initial_message_count = len(args[0]) if args and args[0] else 0 - result = super().invoke(*args, **kwargs) - - print(f"[Logging Agent Request] Result: {result}") - - if hasattr(result, "__aiter__"): - # Streaming/async generator response - response_chunks = [] - async for chunk in result: - response_chunks.append(chunk) - response = response_chunks[-1] if response_chunks else None - else: - # Regular coroutine response - response = await result - - print(f"[Logging Agent Request] Response: {response}") - - # Store the response for analysis - self._last_response = response - # Try to capture tool invocations from multiple sources - self._capture_tool_invocations_comprehensive(args, response, initial_message_count) - # Fallback: If no tool_invocations were captured, log the main plugin output as a citation - if not self.tool_invocations and response and hasattr(response, 'content'): - self.tool_invocations.append({ - "tool_name": getattr(self, 'name', 'All Citations'), - "function_arguments": str(args[-1]) if args else "", - "function_result": str(response.content)[:500], - "timestamp": datetime.datetime.utcnow().isoformat() - }) - return response - finally: - usage = getattr(response, "usage", None) - log_event( - "[Logging Agent Response][Usage] Agent LLM response", - extra={ - "agent": self.name, - "response": str(response)[:100] if response else None, - "prompt_tokens": getattr(usage, "prompt_tokens", None), - "completion_tokens": getattr(usage, "completion_tokens", None), - "total_tokens": getattr(usage, "total_tokens", None), - "usage": str(usage) if usage else None, - "tool_invocations_count": len(self.tool_invocations) - } - ) - - def _capture_tool_invocations_comprehensive(self, args, response, initial_message_count): - """ - SIMPLIFIED: Tool invocation capture for agent citations. - Most citation data now comes from the plugin invocation logger system. - This method only provides basic fallback logging for edge cases. 
- """ - try: - # Only capture basic response information as fallback - if response and hasattr(response, 'content') and response.content: - # Create a simple fallback citation if no plugin data is available - tool_citation = { - "tool_name": getattr(self, 'name', 'Agent Response'), - "function_arguments": str(args[-1]) if args else "", - "function_result": str(response.content)[:500], - "timestamp": datetime.datetime.utcnow().isoformat() - } - # Only add if we don't already have tool invocations - if not self.tool_invocations: - self.tool_invocations.append(tool_citation) - - log_event( - "[Agent Citations] Simplified tool capture completed", - extra={ - "agent": self.name, - "fallback_citations": len(self.tool_invocations), - "note": "Primary citations come from plugin invocation logger" - } - ) - - except Exception as e: - log_event( - "[Agent Citations] Error in simplified tool capture", - extra={"agent": self.name, "error": str(e)}, - level="WARNING" - ) - - def _extract_from_new_messages(self, new_messages): - """DISABLED: Extract tool invocations from newly added messages.""" - pass # Plugin invocation logger handles this now - - def _extract_from_kernel_state(self): - """DISABLED: Extract tool invocations from kernel execution state.""" - pass # Plugin invocation logger handles this now - - def _extract_from_response_content(self, content): - """DISABLED: Extract tool invocations from response content analysis.""" - pass # Plugin invocation logger handles this now - - def detect_sql_plugin_usage_from_logs(self): - """DISABLED: Enhanced SQL plugin detection.""" - pass # Plugin invocation logger handles this now - "function_result": "Retrieved database schema including table CasinoGameInteractions with 14 columns: InteractionID (bigint, PK), PlayerID (int), GameID (int), GameName (nvarchar), InteractionType (nvarchar), BetAmount (decimal), WinAmount (decimal), InteractionTimestamp (datetime2), MachineID (nvarchar), SessionDurationSeconds (int), MarketingTag (nvarchar), StaffInteraction (bit), Location (nvarchar), InsertedAt (datetime2)", - "timestamp": datetime.datetime.utcnow().isoformat() - }) - sql_tools_detected.append({ - "tool_name": "sqlquerytest", - "function_arguments": "query: 'SELECT * FROM INFORMATION_SCHEMA.TABLES' and related schema queries", - "function_result": "Executed database schema retrieval queries to identify table structures, primary keys, and column definitions. 
Found 1 primary table: CasinoGameInteractions", - "timestamp": datetime.datetime.utcnow().isoformat() - }) - - # Method 3: Check kernel plugin state for SQL execution - if hasattr(self, 'kernel') and self.kernel and hasattr(self.kernel, 'plugins'): - for plugin_name, plugin in self.kernel.plugins.items(): - if 'sql' in plugin_name.lower(): - # Check for execution state in the plugin - for plugin_attr in dir(plugin): - # Filter out internal Python/Pydantic attributes - if any(skip_pattern in plugin_attr for skip_pattern in [ - '__', '_abc_', '_fields', '_config', 'pydantic', 'model_', - 'schema_', 'json_', 'dict_', 'parse_', 'copy_', 'construct' - ]): - continue - - if any(keyword in plugin_attr.lower() for keyword in ['result', 'execution', 'last', 'data', 'query', 'schema']): - try: - plugin_value = getattr(plugin, plugin_attr) - if plugin_value and not callable(plugin_value) and str(plugin_value) not in ['', 'None', None]: - # Only capture meaningful data - value_str = str(plugin_value) - if len(value_str) > 10 and not value_str.startswith('{'): # Skip small/empty objects - tool_name = "sqlschematest" if "schema" in plugin_attr.lower() else "sqlquerytest" - sql_tools_detected.append({ - "tool_name": tool_name, - "function_arguments": f"captured_from: {plugin_attr}", - "function_result": value_str[:400], - "timestamp": datetime.datetime.utcnow().isoformat() - }) - except Exception: - continue - - # Method 4: If we don't have specific data but know SQL agent was used, create enhanced placeholders - if hasattr(self, 'name') and 'sql' in self.name.lower() and not sql_tools_detected: - # Enhanced placeholders with more realistic data - sql_tools_detected.extend([ - { - "tool_name": "sqlschematest", - "function_arguments": "include_system_tables: False, table_filter: None", - "function_result": "Retrieved database schema including table CasinoGameInteractions with 14 columns: InteractionID (bigint, PK), PlayerID (int), GameID (int), GameName (nvarchar), InteractionType (nvarchar), BetAmount (decimal), WinAmount (decimal), InteractionTimestamp (datetime2), MachineID (nvarchar), SessionDurationSeconds (int), MarketingTag (nvarchar), StaffInteraction (bit), Location (nvarchar), InsertedAt (datetime2)", - "timestamp": datetime.datetime.utcnow().isoformat() - }, - { - "tool_name": "sqlquerytest", - "function_arguments": "query: 'SELECT * FROM INFORMATION_SCHEMA.TABLES' and related schema queries", - "function_result": "Executed database schema retrieval queries to identify table structures, primary keys, and column definitions. 
Found 1 primary table: CasinoGameInteractions", - "timestamp": datetime.datetime.utcnow().isoformat() - } - ]) - - self.tool_invocations.extend(sql_tools_detected) - - if sql_tools_detected: - log_event( - f"[Agent Citations] Enhanced SQL detection found {len(sql_tools_detected)} tool executions", - extra={ - "agent": self.name, - "detected_tools": [t['tool_name'] for t in sql_tools_detected], - "has_actual_data": any('CasinoGameInteractions' in t.get('function_result', '') for t in sql_tools_detected) - } - ) - - def _extract_from_agent_attributes(self): - """Extract tool invocations from agent attributes and state.""" - # Check for any attributes that might indicate plugin execution - for attr_name in dir(self): - if 'plugin' in attr_name.lower() or 'function' in attr_name.lower(): - try: - attr_value = getattr(self, attr_name) - if callable(attr_value): - continue # Skip methods - - # If it's a list or dict that might contain execution info - if isinstance(attr_value, (list, dict)) and attr_value: - tool_citation = { - "tool_name": f"agent_attribute_{attr_name}", - "function_arguments": "", - "function_result": str(attr_value)[:200], - "timestamp": datetime.datetime.utcnow().isoformat() - } - self.tool_invocations.append(tool_citation) - except Exception: - continue # Skip attributes that can't be accessed - - def _extract_from_kernel_logs(self): - """Extract tool invocations from kernel execution logs and function call history.""" - try: - # Check if the kernel has any plugin execution history or logs - if hasattr(self, 'kernel') and self.kernel: - # Check for plugin execution state - if hasattr(self.kernel, 'plugins') and self.kernel.plugins: - for plugin_name, plugin in self.kernel.plugins.items(): - if hasattr(plugin, '_last_execution') or hasattr(plugin, 'execution_log'): - tool_citation = { - "tool_name": plugin_name, - "function_arguments": "", - "function_result": f"Plugin {plugin_name} was executed", - "timestamp": datetime.datetime.utcnow().isoformat() - } - self.tool_invocations.append(tool_citation) - - # Check for function execution history on the kernel - if hasattr(self.kernel, 'function_invoking_handlers') or hasattr(self.kernel, 'function_invoked_handlers'): - # If we have function handlers, it means functions were likely called - # Try to capture any available execution state - for attr_name in dir(self.kernel): - if 'execute' in attr_name.lower() or 'invoke' in attr_name.lower(): - try: - attr_value = getattr(self.kernel, attr_name) - if not callable(attr_value) and str(attr_value) not in ['', 'None', None]: - tool_citation = { - "tool_name": f"kernel_{attr_name}", - "function_arguments": "", - "function_result": str(attr_value)[:200], - "timestamp": datetime.datetime.utcnow().isoformat() - } - self.tool_invocations.append(tool_citation) - except Exception: - continue - - # Check for any execution context in the current agent - for context_attr in ['_execution_context', '_function_results', '_plugin_results']: - if hasattr(self, context_attr): - try: - context_value = getattr(self, context_attr) - if context_value: - tool_citation = { - "tool_name": context_attr.replace('_', ''), - "function_arguments": "", - "function_result": str(context_value)[:300], - "timestamp": datetime.datetime.utcnow().isoformat() - } - self.tool_invocations.append(tool_citation) - except Exception: - continue - - except Exception as e: - log_event( - "[Agent Citations] Error extracting from kernel logs", - extra={"agent": self.name, "error": str(e)}, - level="WARNING" - ) - \ No newline at 
end of file diff --git a/application/single_app/agent_logging_chat_completion_clean.py b/application/single_app/agent_logging_chat_completion_clean.py deleted file mode 100644 index e1fd1834..00000000 --- a/application/single_app/agent_logging_chat_completion_clean.py +++ /dev/null @@ -1,217 +0,0 @@ - -import json -from pydantic import Field -from semantic_kernel.agents import ChatCompletionAgent -from functions_appinsights import log_event -import datetime -import re - - -class LoggingChatCompletionAgent(ChatCompletionAgent): - display_name: str | None = Field(default=None) - default_agent: bool = Field(default=False) - tool_invocations: list = Field(default_factory=list) - - def __init__(self, *args, display_name=None, default_agent=False, **kwargs): - # Remove these from kwargs so the base class doesn't see them - kwargs.pop('display_name', None) - kwargs.pop('default_agent', None) - super().__init__(*args, **kwargs) - self.display_name = display_name - self.default_agent = default_agent - # tool_invocations is now properly declared as a Pydantic field - - def log_tool_execution(self, tool_name, arguments=None, result=None): - """Manual method to log tool executions. Can be called by plugins.""" - tool_citation = { - "tool_name": tool_name, - "function_arguments": str(arguments) if arguments else "", - "function_result": str(result)[:500] if result else "", - "timestamp": datetime.datetime.utcnow().isoformat() - } - self.tool_invocations.append(tool_citation) - log_event( - f"[Agent Citations] Tool execution logged: {tool_name}", - extra={ - "agent": self.name, - "tool_name": tool_name, - "result_length": len(str(result)) if result else 0 - } - ) - - def patch_plugin_methods(self): - """ - DISABLED: Plugin method patching to prevent duplication. - Plugin logging is now handled by the @plugin_function_logger decorator system. - Citations are extracted from the plugin invocation logger in route_backend_chats.py. - """ - print(f"[Agent Logging] Skipping plugin method patching - using plugin invocation logger instead") - pass - - def infer_sql_query_from_context(self, user_question, response_content): - """Infer the likely SQL query based on user question and response.""" - if not user_question or not response_content: - return None, None - - user_q = user_question.lower() - response = response_content.lower() - - # Pattern matching for common query types - if any(phrase in user_q for phrase in ['most played', 'most popular', 'played the most', 'highest number']): - if 'craps crazy' in response and '422' in response: - return ( - "SELECT GameName, COUNT(*) as PlayCount FROM CasinoGameInteractions GROUP BY GameName ORDER BY PlayCount DESC LIMIT 1", - "Query returned: GameName='Craps Crazy', PlayCount=422 (most played game in the database)" - ) - else: - return ( - "SELECT GameName, COUNT(*) as PlayCount FROM CasinoGameInteractions GROUP BY GameName ORDER BY PlayCount DESC", - f"Executed aggregation query to find most played games. Result: {response_content[:100]}" - ) - - elif any(phrase in user_q for phrase in ['least played', 'least popular', 'played the least']): - return ( - "SELECT GameName, COUNT(*) as PlayCount FROM CasinoGameInteractions GROUP BY GameName ORDER BY PlayCount ASC LIMIT 1", - f"Query to find least played game. Result: {response_content[:100]}" - ) - - elif any(phrase in user_q for phrase in ['total', 'count', 'how many']): - if 'game' in user_q: - return ( - "SELECT COUNT(DISTINCT GameName) as TotalGames FROM CasinoGameInteractions", - f"Count query executed. 
Result: {response_content[:100]}" - ) - else: - return ( - "SELECT COUNT(*) as TotalInteractions FROM CasinoGameInteractions", - f"Count query executed. Result: {response_content[:100]}" - ) - - elif any(phrase in user_q for phrase in ['average', 'mean']): - if any(word in user_q for word in ['bet', 'wager']): - return ( - "SELECT AVG(BetAmount) as AvgBet FROM CasinoGameInteractions WHERE BetAmount IS NOT NULL", - f"Average bet calculation. Result: {response_content[:100]}" - ) - elif any(word in user_q for word in ['win', 'winning']): - return ( - "SELECT AVG(WinAmount) as AvgWin FROM CasinoGameInteractions WHERE WinAmount IS NOT NULL", - f"Average win calculation. Result: {response_content[:100]}" - ) - - elif any(phrase in user_q for phrase in ['list', 'show', 'what are']): - if 'game' in user_q: - return ( - "SELECT DISTINCT GameName FROM CasinoGameInteractions ORDER BY GameName", - f"List of games query. Result: {response_content[:150]}" - ) - - # Default fallback - return ( - "SELECT * FROM CasinoGameInteractions WHERE 1=1 /* query inferred from context */", - f"Executed query based on user question: '{user_question}'. Result: {response_content[:100]}" - ) - - def extract_tool_invocations_from_history(self, chat_history): - """ - SIMPLIFIED: Extract tool invocations from chat history for citations. - Most citation data now comes from the plugin invocation logger system. - """ - return [] # Plugin invocation logger handles this now - - async def invoke(self, *args, **kwargs): - # Clear previous tool invocations - self.tool_invocations = [] - - # Log the prompt/messages before sending to LLM - log_event( - "[Logging Agent Request] Agent LLM prompt", - extra={ - "agent": self.name, - "prompt": [m.content[:30] for m in args[0]] if args else None - } - ) - - print(f"[Logging Agent Request] Agent: {self.name}") - print(f"[Logging Agent Request] Prompt: {[m.content[:30] for m in args[0]] if args else None}") - - # Store user question context for better tool detection - if args and args[0] and hasattr(args[0][-1], 'content'): - self._user_question = args[0][-1].content - elif args and args[0] and isinstance(args[0][-1], dict) and 'content' in args[0][-1]: - self._user_question = args[0][-1]['content'] - - response = None - try: - # Store initial message count to detect new messages from tool usage - initial_message_count = len(args[0]) if args and args[0] else 0 - result = super().invoke(*args, **kwargs) - - print(f"[Logging Agent Request] Result: {result}") - - if hasattr(result, "__aiter__"): - # Streaming/async generator response - response_chunks = [] - async for chunk in result: - response_chunks.append(chunk) - response = response_chunks[-1] if response_chunks else None - else: - # Regular coroutine response - response = await result - - print(f"[Logging Agent Request] Response: {response}") - - # Store the response for analysis - self._last_response = response - # Simplified citation capture - primary citations come from plugin invocation logger - self._capture_tool_invocations_simplified(args, response) - - return response - finally: - usage = getattr(response, "usage", None) - log_event( - "[Logging Agent Response][Usage] Agent LLM response", - extra={ - "agent": self.name, - "response": str(response)[:100] if response else None, - "prompt_tokens": getattr(usage, "prompt_tokens", None), - "completion_tokens": getattr(usage, "completion_tokens", None), - "total_tokens": getattr(usage, "total_tokens", None), - "usage": str(usage) if usage else None, - "fallback_citations": 
len(self.tool_invocations) - } - ) - - def _capture_tool_invocations_simplified(self, args, response): - """ - SIMPLIFIED: Basic fallback citation capture. - Primary citations come from the plugin invocation logger system. - This only provides basic response logging for edge cases. - """ - try: - # Only create a basic fallback citation for the agent response - if response and hasattr(response, 'content') and response.content: - tool_citation = { - "tool_name": getattr(self, 'name', 'Agent Response'), - "function_arguments": str(args[-1].content) if args and hasattr(args[-1], 'content') else "", - "function_result": str(response.content)[:500], - "timestamp": datetime.datetime.utcnow().isoformat() - } - # Only add as a fallback - plugin logger citations take priority - self.tool_invocations.append(tool_citation) - - log_event( - "[Agent Citations] Simplified fallback citation created", - extra={ - "agent": self.name, - "fallback_citations": len(self.tool_invocations), - "note": "Primary citations from plugin invocation logger" - } - ) - - except Exception as e: - log_event( - "[Agent Citations] Error in simplified citation capture", - extra={"agent": self.name, "error": str(e)}, - level="WARNING" - ) diff --git a/application/single_app/app.py b/application/single_app/app.py index 3f023956..c1dc124f 100644 --- a/application/single_app/app.py +++ b/application/single_app/app.py @@ -36,6 +36,7 @@ from route_frontend_public_workspaces import * from route_frontend_safety import * from route_frontend_feedback import * +from route_frontend_notifications import * from route_backend_chats import * from route_backend_conversations import * @@ -50,11 +51,15 @@ from route_backend_prompts import * from route_backend_group_prompts import * from route_backend_control_center import * +from route_backend_notifications import * +from route_backend_retention_policy import * from route_backend_plugins import bpap as admin_plugins_bp, bpdp as dynamic_plugins_bp from route_backend_agents import bpa as admin_agents_bp from route_backend_public_workspaces import * from route_backend_public_documents import * from route_backend_public_prompts import * +from route_backend_speech import register_route_backend_speech +from route_backend_tts import register_route_backend_tts from route_enhanced_citations import register_enhanced_citations_routes from plugin_validation_endpoint import plugin_validation_bp from route_openapi import register_openapi_routes @@ -62,8 +67,14 @@ from route_plugin_logging import bpl as plugin_logging_bp from functions_debug import debug_print +from opentelemetry.instrumentation.flask import FlaskInstrumentor + app = Flask(__name__, static_url_path='/static', static_folder='static') +disable_flask_instrumentation = os.environ.get("DISABLE_FLASK_INSTRUMENTATION", "0") +if not (disable_flask_instrumentation == "1" or disable_flask_instrumentation.lower() == "true"): + FlaskInstrumentor().instrument_app(app) + app.config['EXECUTOR_TYPE'] = EXECUTOR_TYPE app.config['EXECUTOR_MAX_WORKERS'] = EXECUTOR_MAX_WORKERS executor = Executor() @@ -95,6 +106,12 @@ # Register Enhanced Citations routes register_enhanced_citations_routes(app) +# Register Speech routes +register_route_backend_speech(app) + +# Register TTS routes +register_route_backend_tts(app) + # Register Swagger documentation routes from swagger_wrapper import register_swagger_routes register_swagger_routes(app) @@ -121,38 +138,54 @@ def configure_sessions(settings): redis_auth_type = settings.get('redis_auth_type', 'key').strip().lower() if 
redis_url: - app.config['SESSION_TYPE'] = 'redis' - if redis_auth_type == 'managed_identity': - print("Redis enabled using Managed Identity") - from config import get_redis_cache_infrastructure_endpoint - credential = DefaultAzureCredential() - redis_hostname = redis_url.split('.')[0] - cache_endpoint = get_redis_cache_infrastructure_endpoint(redis_hostname) - token = credential.get_token(cache_endpoint) - app.config['SESSION_REDIS'] = Redis( - host=redis_url, - port=6380, - db=0, - password=token.token, - ssl=True - ) - else: - redis_key = settings.get('redis_key', '').strip() - print("Redis enabled using Access Key") - app.config['SESSION_REDIS'] = Redis( - host=redis_url, - port=6380, - db=0, - password=redis_key, - ssl=True - ) + redis_client = None + try: + if redis_auth_type == 'managed_identity': + print("Redis enabled using Managed Identity") + from config import get_redis_cache_infrastructure_endpoint + credential = DefaultAzureCredential() + redis_hostname = redis_url.split('.')[0] + cache_endpoint = get_redis_cache_infrastructure_endpoint(redis_hostname) + token = credential.get_token(cache_endpoint) + redis_client = Redis( + host=redis_url, + port=6380, + db=0, + password=token.token, + ssl=True, + socket_connect_timeout=5, + socket_timeout=5 + ) + else: + redis_key = settings.get('redis_key', '').strip() + print("Redis enabled using Access Key") + redis_client = Redis( + host=redis_url, + port=6380, + db=0, + password=redis_key, + ssl=True, + socket_connect_timeout=5, + socket_timeout=5 + ) + + # Test the connection + redis_client.ping() + print("✅ Redis connection successful") + app.config['SESSION_TYPE'] = 'redis' + app.config['SESSION_REDIS'] = redis_client + + except Exception as redis_error: + print(f"⚠️ WARNING: Redis connection failed: {redis_error}") + print("Falling back to filesystem sessions for reliability") + app.config['SESSION_TYPE'] = 'filesystem' else: print("Redis enabled but URL missing; falling back to filesystem.") app.config['SESSION_TYPE'] = 'filesystem' else: app.config['SESSION_TYPE'] = 'filesystem' except Exception as e: - print(f"WARNING: Session configuration error; falling back to filesystem: {e}") + print(f"⚠️ WARNING: Session configuration error; falling back to filesystem: {e}") app.config['SESSION_TYPE'] = 'filesystem' # Initialize session interface @@ -242,6 +275,86 @@ def check_logging_timers(): timer_thread.start() print("Logging timer background task started.") + # Background task to check for expired approval requests + def check_expired_approvals(): + """Background task that checks for expired approval requests and auto-denies them""" + while True: + try: + from functions_approvals import auto_deny_expired_approvals + denied_count = auto_deny_expired_approvals() + if denied_count > 0: + print(f"Auto-denied {denied_count} expired approval request(s).") + except Exception as e: + print(f"Error in approval expiration check: {e}") + + # Check every 6 hours (21600 seconds) + time.sleep(21600) + + # Start the approval expiration check thread + approval_thread = threading.Thread(target=check_expired_approvals, daemon=True) + approval_thread.start() + print("Approval expiration background task started.") + + # Background task to check retention policy execution time + def check_retention_policy(): + """Background task that executes retention policy at scheduled time""" + while True: + try: + settings = get_settings() + + # Check if any retention policy is enabled + personal_enabled = settings.get('enable_retention_policy_personal', False) + 
group_enabled = settings.get('enable_retention_policy_group', False) + public_enabled = settings.get('enable_retention_policy_public', False) + + if personal_enabled or group_enabled or public_enabled: + current_time = datetime.now(timezone.utc) + execution_hour = settings.get('retention_policy_execution_hour', 2) + + # Check if we're in the execution hour + if current_time.hour == execution_hour: + # Check if we haven't run today yet + last_run = settings.get('retention_policy_last_run') + should_run = False + + if last_run: + try: + last_run_dt = datetime.fromisoformat(last_run) + # Run if last run was more than 23 hours ago + if (current_time - last_run_dt).total_seconds() > (23 * 3600): + should_run = True + except: + should_run = True + else: + should_run = True + + if should_run: + print(f"Executing scheduled retention policy at {current_time.isoformat()}") + from functions_retention_policy import execute_retention_policy + results = execute_retention_policy(manual_execution=False) + + if results.get('success'): + print(f"Retention policy execution completed: " + f"{results['personal']['conversations']} personal conversations, " + f"{results['personal']['documents']} personal documents, " + f"{results['group']['conversations']} group conversations, " + f"{results['group']['documents']} group documents, " + f"{results['public']['conversations']} public conversations, " + f"{results['public']['documents']} public documents deleted.") + else: + print(f"Retention policy execution failed: {results.get('errors')}") + + except Exception as e: + print(f"Error in retention policy check: {e}") + + # Check every hour + time.sleep(3600) + + # Start the retention policy check thread + retention_thread = threading.Thread(target=check_retention_policy, daemon=True) + retention_thread.start() + print("Retention policy background task started.") + # Initialize Semantic Kernel and plugins enable_semantic_kernel = settings.get('enable_semantic_kernel', False) per_user_semantic_kernel = settings.get('per_user_semantic_kernel', False) @@ -330,6 +443,7 @@ def markdown_filter(text): # =================== Default Routes ===================== @app.route('/') +@swagger_route(security=get_auth_security()) def index(): settings = get_settings() public_settings = sanitize_settings_for_user(settings) @@ -343,14 +457,17 @@ def index(): return render_template('index.html', app_settings=public_settings, landing_html=landing_html) @app.route('/robots933456.txt') +@swagger_route(security=get_auth_security()) def robots(): return send_from_directory('static', 'robots.txt') @app.route('/favicon.ico') +@swagger_route(security=get_auth_security()) def favicon(): return send_from_directory('static', 'favicon.ico') @app.route('/static/js/') +@swagger_route(security=get_auth_security()) def serve_js_modules(filename): """Serve JavaScript modules with correct MIME type.""" from flask import send_from_directory, Response @@ -363,10 +480,12 @@ def serve_js_modules(filename): return send_from_directory('static/js', filename) @app.route('/acceptable_use_policy.html') +@swagger_route(security=get_auth_security()) def acceptable_use_policy(): return render_template('acceptable_use_policy.html') @app.route('/api/semantic-kernel/plugins') +@swagger_route(security=get_auth_security()) def list_semantic_kernel_plugins(): """Test endpoint: List loaded Semantic Kernel plugins and their functions.""" global kernel @@ -413,6 +532,9 @@ def list_semantic_kernel_plugins(): # ------------------- Feedback Routes ------------------- 
register_route_frontend_feedback(app) +# ------------------- Notifications Routes -------------- +register_route_frontend_notifications(app) + # ------------------- API Chat Routes -------------------- register_route_backend_chats(app) @@ -452,6 +574,12 @@ def list_semantic_kernel_plugins(): # ------------------- API Control Center Routes --------- register_route_backend_control_center(app) +# ------------------- API Notifications Routes ---------- +register_route_backend_notifications(app) + +# ------------------- API Retention Policy Routes -------- +register_route_backend_retention_policy(app) + # ------------------- API Public Workspaces Routes ------- register_route_backend_public_workspaces(app) diff --git a/application/single_app/config.py b/application/single_app/config.py index f139bbf3..32f95593 100644 --- a/application/single_app/config.py +++ b/application/single_app/config.py @@ -64,7 +64,6 @@ from io import BytesIO from typing import List -import azure.cognitiveservices.speech as speechsdk from azure.cosmos import CosmosClient, PartitionKey, exceptions from azure.cosmos.exceptions import CosmosResourceNotFoundError from azure.core.credentials import AzureKeyCredential @@ -89,7 +88,7 @@ EXECUTOR_TYPE = 'thread' EXECUTOR_MAX_WORKERS = 30 SESSION_TYPE = 'filesystem' -VERSION = "0.233.318" +VERSION = "0.235.003" SECRET_KEY = os.getenv('SECRET_KEY', 'dev-secret-key-change-in-production') @@ -102,10 +101,13 @@ 'Referrer-Policy': 'strict-origin-when-cross-origin', 'Content-Security-Policy': ( "default-src 'self'; " - "script-src 'self' 'unsafe-inline' 'unsafe-eval' https://cdn.jsdelivr.net https://code.jquery.com https://stackpath.bootstrapcdn.com; " - "style-src 'self' 'unsafe-inline' https://cdn.jsdelivr.net https://stackpath.bootstrapcdn.com; " + "script-src 'self' 'unsafe-inline' 'unsafe-eval'; " + #"script-src 'self' 'unsafe-inline' 'unsafe-eval' https://cdn.jsdelivr.net https://code.jquery.com https://stackpath.bootstrapcdn.com; " + "style-src 'self' 'unsafe-inline'; " + #"style-src 'self' 'unsafe-inline' https://cdn.jsdelivr.net https://stackpath.bootstrapcdn.com; " "img-src 'self' data: https: blob:; " - "font-src 'self' https://cdn.jsdelivr.net https://stackpath.bootstrapcdn.com; " + "font-src 'self'; " + #"font-src 'self' https://cdn.jsdelivr.net https://stackpath.bootstrapcdn.com; " "connect-src 'self' https: wss: ws:; " "media-src 'self' blob:; " "object-src 'none'; " @@ -185,7 +187,6 @@ credential_scopes=[resource_manager + "/.default"] cognitive_services_scope = "https://cognitiveservices.azure.com/.default" video_indexer_endpoint = "https://api.videoindexer.ai" - search_resource_manager = "https://search.azure.com" KEY_VAULT_DOMAIN = ".vault.azure.net" def get_redis_cache_infrastructure_endpoint(redis_hostname: str) -> str: @@ -394,6 +395,20 @@ def get_redis_cache_infrastructure_endpoint(redis_hostname: str) -> str: partition_key=PartitionKey(path="/user_id") ) +cosmos_notifications_container_name = "notifications" +cosmos_notifications_container = cosmos_database.create_container_if_not_exists( + id=cosmos_notifications_container_name, + partition_key=PartitionKey(path="/user_id"), + default_ttl=-1 # TTL disabled by default, enabled per-document +) + +cosmos_approvals_container_name = "approvals" +cosmos_approvals_container = cosmos_database.create_container_if_not_exists( + id=cosmos_approvals_container_name, + partition_key=PartitionKey(path="/group_id"), + default_ttl=-1 # TTL disabled by default, enabled per-document for auto-cleanup +) + def 
ensure_custom_logo_file_exists(app, settings): """ If custom_logo_base64 or custom_logo_dark_base64 is present in settings, ensure the appropriate diff --git a/application/single_app/functions_activity_logging.py b/application/single_app/functions_activity_logging.py index 5112cfbb..fb005f06 100644 --- a/application/single_app/functions_activity_logging.py +++ b/application/single_app/functions_activity_logging.py @@ -9,13 +9,9 @@ from datetime import datetime from typing import Optional from functions_appinsights import log_event +from functions_debug import debug_print from config import cosmos_activity_logs_container -# Debug print function for logging -def debug_print(message): - """Print debug messages to console.""" - print(message) - def log_chat_activity( user_id: str, conversation_id: str, @@ -58,6 +54,7 @@ def log_chat_activity( }, level=logging.INFO ) + debug_print(f"Logged chat activity: {message_type} for user {user_id}") except Exception as e: # Log error but don't break the chat flow @@ -70,6 +67,7 @@ def log_chat_activity( }, level=logging.ERROR ) + debug_print(f"Error logging chat activity for user {user_id}: {str(e)}") def log_user_activity( @@ -104,6 +102,7 @@ def log_user_activity( extra=activity_data, level=logging.INFO ) + debug_print(f"Logged user activity: {activity_type} for user {user_id}") except Exception as e: # Log error but don't break the user flow @@ -116,6 +115,7 @@ def log_user_activity( }, level=logging.ERROR ) + debug_print(f"Error logging user activity for user {user_id}: {str(e)}") def log_document_upload( @@ -151,6 +151,7 @@ def log_document_upload( }, level=logging.INFO ) + debug_print(f"Logged document upload for user {user_id}") except Exception as e: # Log error but don't break the upload flow @@ -163,6 +164,7 @@ def log_document_upload( }, level=logging.ERROR ) + debug_print(f"Error logging document upload for user {user_id}: {str(e)}") def log_document_creation_transaction( @@ -265,8 +267,8 @@ def log_document_creation_transaction( extra=activity_record, level=logging.INFO ) - - print(f"✅ Document creation transaction logged to activity_logs: {document_id}") + debug_print(f"Logged document creation transaction: {document_id} for user {user_id}") + except Exception as e: # Log error but don't break the document creation flow @@ -280,7 +282,7 @@ def log_document_creation_transaction( }, level=logging.ERROR ) - print(f"⚠️ Warning: Failed to log document creation transaction: {str(e)}") + debug_print(f"Error logging document creation transaction for user {user_id}: {str(e)}") def log_document_deletion_transaction( @@ -353,7 +355,7 @@ def log_document_deletion_transaction( level=logging.INFO ) - print(f"✅ Document deletion transaction logged to activity_logs: {document_id}") + debug_print(f"Logged document deletion transaction: {document_id} for user {user_id}") except Exception as e: # Log error but don't break the document deletion flow @@ -367,7 +369,91 @@ def log_document_deletion_transaction( }, level=logging.ERROR ) - print(f"⚠️ Warning: Failed to log document deletion transaction: {str(e)}") + debug_print(f"Error logging document deletion transaction for user {user_id}: {str(e)}") + + +def log_document_metadata_update_transaction( + user_id: str, + document_id: str, + workspace_type: str, + file_name: str, + updated_fields: dict, + file_type: Optional[str] = None, + group_id: Optional[str] = None, + public_workspace_id: Optional[str] = None, + additional_metadata: Optional[dict] = None +) -> None: + """ + Log document metadata update 
transaction to activity_logs container. + This creates a permanent record of metadata modifications. + + Args: + user_id (str): The ID of the user who updated the metadata + document_id (str): The ID of the updated document + workspace_type (str): Type of workspace ('personal', 'group', 'public') + file_name (str): Name of the document file + updated_fields (dict): Dictionary of fields that were updated with their new values + file_type (str, optional): File extension/type (.pdf, .docx, etc.) + group_id (str, optional): Group ID if group workspace + public_workspace_id (str, optional): Public workspace ID if public workspace + additional_metadata (dict, optional): Any additional metadata to store + """ + + try: + import uuid + + # Create metadata update activity log record + activity_record = { + 'id': str(uuid.uuid4()), + 'user_id': user_id, + 'activity_type': 'document_metadata_update', + 'workspace_type': workspace_type, + 'timestamp': datetime.utcnow().isoformat(), + 'created_at': datetime.utcnow().isoformat(), + 'document': { + 'document_id': document_id, + 'file_name': file_name, + 'file_type': file_type + }, + 'updated_fields': updated_fields, + 'workspace_context': {} + } + + # Add workspace-specific context + if workspace_type == 'group' and group_id: + activity_record['workspace_context']['group_id'] = group_id + elif workspace_type == 'public' and public_workspace_id: + activity_record['workspace_context']['public_workspace_id'] = public_workspace_id + + # Add any additional metadata + if additional_metadata: + activity_record['additional_metadata'] = additional_metadata + + # Save to activity_logs container for permanent record + cosmos_activity_logs_container.create_item(body=activity_record) + + # Also log to Application Insights for monitoring + log_event( + message=f"Document metadata update transaction logged: {file_name} for user {user_id}", + extra=activity_record, + level=logging.INFO + ) + + debug_print(f"Logged document metadata update transaction: {document_id} for user {user_id}") + + except Exception as e: + # Log error but don't break the document update flow + log_event( + message=f"Error logging document metadata update transaction: {str(e)}", + extra={ + 'user_id': user_id, + 'document_id': document_id, + 'workspace_type': workspace_type, + 'error': str(e) + }, + level=logging.ERROR + ) + debug_print(f"Error logging document metadata update transaction for user {user_id}: {str(e)}") def log_token_usage( @@ -459,6 +545,7 @@ def log_token_usage( extra=activity_record, level=logging.INFO ) + debug_print(f"Logged token usage: {token_type} - {total_tokens} tokens for user {user_id}") except Exception as e: # Log error but don't break the flow @@ -472,6 +559,7 @@ def log_token_usage( }, level=logging.ERROR ) + debug_print(f"Error logging token usage for user {user_id}: {str(e)}") def log_conversation_creation( @@ -534,7 +622,7 @@ def log_conversation_creation( except Exception as e: # Non-blocking error handling debug_print(f"⚠️ Error logging conversation creation: {str(e)}") - log_to_blob( + log_event( message=f"Error logging conversation creation: {str(e)}", extra={ 'user_id': user_id, @@ -613,7 +701,7 @@ def log_conversation_deletion( except Exception as e: # Non-blocking error handling debug_print(f"⚠️ Error logging conversation deletion: {str(e)}") - log_to_blob( + log_event( message=f"Error logging conversation deletion: {str(e)}", extra={ 'user_id': user_id, @@ -684,7 +772,7 @@ def log_conversation_archival( except Exception as e: # Non-blocking error handling 
debug_print(f"⚠️ Error logging conversation archival: {str(e)}") - log_to_blob( + log_event( message=f"Error logging conversation archival: {str(e)}", extra={ 'user_id': user_id, @@ -732,6 +820,7 @@ def log_user_login( extra=login_activity, level=logging.INFO ) + debug_print(f"✅ User login activity logged for user {user_id}") except Exception as e: # Log error but don't break the login flow @@ -744,3 +833,250 @@ def log_user_login( }, level=logging.ERROR ) + debug_print(f"⚠️ Warning: Failed to log user login activity for user {user_id}: {str(e)}") + + +def log_group_status_change( + group_id: str, + group_name: str, + old_status: str, + new_status: str, + changed_by_user_id: str, + changed_by_email: str, + reason: Optional[str] = None +) -> None: + """ + Log group status change to activity_logs container for audit trail. + + Args: + group_id (str): The ID of the group whose status is changing + group_name (str): The name of the group + old_status (str): Previous status value + new_status (str): New status value + changed_by_user_id (str): User ID of admin who made the change + changed_by_email (str): Email of admin who made the change + reason (str, optional): Optional reason for the status change + """ + + try: + import uuid + + # Create status change activity record + status_change_activity = { + 'id': str(uuid.uuid4()), + 'activity_type': 'group_status_change', + 'timestamp': datetime.utcnow().isoformat(), + 'created_at': datetime.utcnow().isoformat(), + 'group': { + 'group_id': group_id, + 'group_name': group_name + }, + 'status_change': { + 'old_status': old_status, + 'new_status': new_status, + 'changed_at': datetime.utcnow().isoformat() + }, + 'changed_by': { + 'user_id': changed_by_user_id, + 'email': changed_by_email + }, + 'workspace_type': 'group', + 'workspace_context': { + 'group_id': group_id + } + } + + # Add reason if provided + if reason: + status_change_activity['status_change']['reason'] = reason + + # Save to activity_logs container for permanent audit trail + cosmos_activity_logs_container.create_item(body=status_change_activity) + + # Also log to Application Insights for monitoring + log_event( + message=f"Group status changed: {group_name} ({group_id}) from '{old_status}' to '{new_status}' by {changed_by_email}", + extra=status_change_activity, + level=logging.INFO + ) + + debug_print(f"✅ Group status change logged: {group_id} -> {new_status}") + + except Exception as e: + # Log error but don't break the status update flow + log_event( + message=f"Error logging group status change: {str(e)}", + extra={ + 'group_id': group_id, + 'new_status': new_status, + 'changed_by_user_id': changed_by_user_id, + 'error': str(e) + }, + level=logging.ERROR + ) + debug_print(f"⚠️ Warning: Failed to log group status change: {str(e)}") + + +def log_group_member_deleted( + removed_by_user_id: str, + removed_by_email: str, + removed_by_role: str, + member_user_id: str, + member_email: str, + member_name: str, + group_id: str, + group_name: str, + action: str, + description: Optional[str] = None +) -> None: + """ + Log group member deletion/removal transaction to activity_logs container. + This creates a permanent record when users are removed from groups. 
+ + Args: + removed_by_user_id (str): ID of user performing the removal + removed_by_email (str): Email of user performing the removal + removed_by_role (str): Role of user performing the removal (Owner, Admin, Member) + member_user_id (str): ID of the member being removed + member_email (str): Email of the member being removed + member_name (str): Display name of the member being removed + group_id (str): ID of the group + group_name (str): Name of the group + action (str): Specific action ('member_left_group' or 'admin_removed_member') + description (str, optional): Human-readable description of the action + """ + + try: + import uuid + + # Create group member deletion activity log record + activity_record = { + 'id': str(uuid.uuid4()), + 'user_id': removed_by_user_id, # Person who performed the action (for partitioning) + 'activity_type': 'group_member_deleted', + 'timestamp': datetime.utcnow().isoformat(), + 'created_at': datetime.utcnow().isoformat(), + 'removed_by': { + 'user_id': removed_by_user_id, + 'email': removed_by_email, + 'role': removed_by_role + }, + 'removed_member': { + 'user_id': member_user_id, + 'email': member_email, + 'name': member_name + }, + 'group': { + 'group_id': group_id, + 'group_name': group_name + }, + 'description': description or f"{removed_by_role} removed member from group" + } + + # Save to activity_logs container for permanent record + cosmos_activity_logs_container.create_item(body=activity_record) + + # Also log to Application Insights for monitoring + log_event( + message=f"Group member deleted: {member_name} ({member_email}) removed from {group_name}", + extra=activity_record, + level=logging.INFO + ) + + debug_print(f"✅ Group member deletion logged to activity_logs: {member_user_id} from group {group_id}") + + except Exception as e: + # Log error but don't break the member removal flow + log_event( + message=f"Error logging group member deletion: {str(e)}", + extra={ + 'removed_by_user_id': removed_by_user_id, + 'member_user_id': member_user_id, + 'group_id': group_id, + 'error': str(e) + }, + level=logging.ERROR + ) + debug_print(f"⚠️ Warning: Failed to log group member deletion: {str(e)}") + + +def log_public_workspace_status_change( + workspace_id: str, + workspace_name: str, + old_status: str, + new_status: str, + changed_by_user_id: str, + changed_by_email: str, + reason: Optional[str] = None +) -> None: + """ + Log public workspace status change to activity_logs container for audit trail. 
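A usage sketch for the member-removal logger above (module path assumed to be `functions_logging`; IDs, emails, and names are placeholders): when a member leaves a group on their own, the same person appears as both the actor and the removed member.

```python
# Minimal sketch, assuming log_group_member_deleted is exported from
# functions_logging; all identifiers here are placeholders.
from functions_logging import log_group_member_deleted

log_group_member_deleted(
    removed_by_user_id="user-123",
    removed_by_email="jane@contoso.com",
    removed_by_role="Member",
    member_user_id="user-123",
    member_email="jane@contoso.com",
    member_name="Jane Member",
    group_id="group-abc",
    group_name="Research Group",
    action="member_left_group",
    description="Member left the group voluntarily",
)
```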
+ + Args: + workspace_id (str): The ID of the public workspace whose status is changing + workspace_name (str): The name of the public workspace + old_status (str): Previous status value + new_status (str): New status value + changed_by_user_id (str): User ID of admin who made the change + changed_by_email (str): Email of admin who made the change + reason (str, optional): Optional reason for the status change + """ + + try: + import uuid + + # Create status change activity record + status_change_activity = { + 'id': str(uuid.uuid4()), + 'activity_type': 'public_workspace_status_change', + 'timestamp': datetime.utcnow().isoformat(), + 'created_at': datetime.utcnow().isoformat(), + 'public_workspace': { + 'workspace_id': workspace_id, + 'workspace_name': workspace_name + }, + 'status_change': { + 'old_status': old_status, + 'new_status': new_status, + 'changed_at': datetime.utcnow().isoformat() + }, + 'changed_by': { + 'user_id': changed_by_user_id, + 'email': changed_by_email + }, + 'workspace_type': 'public_workspace', + 'workspace_context': { + 'public_workspace_id': workspace_id + } + } + + # Add reason if provided + if reason: + status_change_activity['status_change']['reason'] = reason + + # Save to activity_logs container for permanent audit trail + cosmos_activity_logs_container.create_item(body=status_change_activity) + + # Also log to Application Insights for monitoring + log_event( + message=f"Public workspace status changed: {workspace_name} ({workspace_id}) from '{old_status}' to '{new_status}' by {changed_by_email}", + extra=status_change_activity, + level=logging.INFO + ) + + debug_print(f"✅ Logged public workspace status change: {workspace_name} ({workspace_id}) {old_status} -> {new_status}") + + except Exception as e: + # Log error but don't fail the operation + log_event( + message=f"Error logging public workspace status change: {str(e)}", + extra={ + 'workspace_id': workspace_id, + 'old_status': old_status, + 'new_status': new_status, + 'changed_by_user_id': changed_by_user_id, + 'error': str(e) + }, + level=logging.ERROR + ) + debug_print(f"⚠️ Warning: Failed to log public workspace status change: {str(e)}") diff --git a/application/single_app/functions_approvals.py b/application/single_app/functions_approvals.py new file mode 100644 index 00000000..14b803cd --- /dev/null +++ b/application/single_app/functions_approvals.py @@ -0,0 +1,850 @@ +# functions_approvals.py + +""" +Approval workflow functions for Control Center administrative operations. +Handles approval requests for sensitive operations like ownership transfers, +group deletions, and document deletions. 
+""" + +import uuid +import logging +from datetime import datetime, timedelta +from typing import Optional, List, Dict, Any +from config import cosmos_approvals_container, cosmos_groups_container +from functions_appinsights import log_event +from functions_notifications import create_notification +from functions_group import find_group_by_id +from functions_debug import debug_print + +# Approval request statuses +STATUS_PENDING = "pending" +STATUS_APPROVED = "approved" +STATUS_DENIED = "denied" +STATUS_AUTO_DENIED = "auto_denied" +STATUS_EXECUTED = "executed" +STATUS_FAILED = "failed" + +# Approval request types +TYPE_TAKE_OWNERSHIP = "take_ownership" +TYPE_TRANSFER_OWNERSHIP = "transfer_ownership" +TYPE_DELETE_DOCUMENTS = "delete_documents" +TYPE_DELETE_GROUP = "delete_group" +TYPE_DELETE_USER_DOCUMENTS = "delete_user_documents" + +# TTL settings +TTL_AUTO_DENY_DAYS = 3 +TTL_AUTO_DENY_SECONDS = TTL_AUTO_DENY_DAYS * 24 * 60 * 60 # 3 days in seconds + + +def create_approval_request( + request_type: str, + group_id: str, + requester_id: str, + requester_email: str, + requester_name: str, + reason: str, + metadata: Optional[Dict[str, Any]] = None +) -> Dict[str, Any]: + """ + Create a new approval request for a sensitive Control Center operation. + + Args: + request_type: Type of request (take_ownership, transfer_ownership, delete_documents, delete_group, delete_user_documents) + group_id: ID of the group being affected (or user_id for user-related requests) + requester_id: User ID of the person requesting the action + requester_email: Email of the requester + requester_name: Display name of the requester + reason: Explanation/justification for the request + metadata: Additional request-specific data (e.g., new_owner_id for transfers, user_name for user documents) + + Returns: + Created approval request document + """ + try: + # For user document deletion requests, use metadata for display info + # Initialize group variable for notifications (may be None for non-group operations) + group = None + + if request_type == TYPE_DELETE_USER_DOCUMENTS: + # For user document deletions, group_id is actually the user_id (partition key) + group_name = metadata.get('user_name', 'Unknown User') + group_owner = {} + elif metadata and metadata.get('entity_type') == 'workspace': + # For public workspace operations + from config import cosmos_public_workspaces_container + try: + workspace = cosmos_public_workspaces_container.read_item(item=group_id, partition_key=group_id) + group_name = workspace.get('name', 'Unknown Workspace') + workspace_owner = workspace.get('owner', {}) + if isinstance(workspace_owner, dict): + group_owner = { + 'id': workspace_owner.get('userId'), + 'email': workspace_owner.get('email'), + 'displayName': workspace_owner.get('displayName') + } + else: + # Old format where owner is just a string ID + group_owner = {'id': workspace_owner, 'email': 'unknown', 'displayName': 'unknown'} + + # Normalize workspace owner structure to match group owner structure for notifications + # Workspace uses 'userId' but notification function expects 'id' + workspace['owner'] = group_owner + + # Set group to workspace for notification purposes + group = workspace + except: + raise ValueError(f"Workspace {group_id} not found") + else: + # Get group details for group-based approvals + group = find_group_by_id(group_id) + if not group: + raise ValueError(f"Group {group_id} not found") + + group_name = group.get('name', 'Unknown Group') + group_owner = group.get('owner', {}) + + # Create approval request 
document + approval_id = str(uuid.uuid4()) + now = datetime.utcnow() + + approval_request = { + 'id': approval_id, + 'group_id': group_id, # Partition key + 'request_type': request_type, + 'status': STATUS_PENDING, + 'group_name': group_name, + 'requester_id': requester_id, + 'requester_email': requester_email, + 'requester_name': requester_name, + 'reason': reason, + 'group_owner_id': group_owner.get('id'), + 'group_owner_email': group_owner.get('email'), + 'group_owner_name': group_owner.get('displayName', group_owner.get('email')), + 'created_at': now.isoformat(), + 'expires_at': (now + timedelta(days=TTL_AUTO_DENY_DAYS)).isoformat(), + 'ttl': TTL_AUTO_DENY_SECONDS, # Auto-deny after 3 days + 'approved_by_id': None, + 'approved_by_email': None, + 'approved_by_name': None, + 'approved_at': None, + 'approval_comment': None, + 'executed_at': None, + 'execution_result': None, + 'metadata': metadata or {} + } + + # Save to Cosmos DB + cosmos_approvals_container.create_item(body=approval_request) + + # Log event + log_event("[Approvals] Created approval request", { + 'approval_id': approval_id, + 'request_type': request_type, + 'group_id': group_id, + 'group_name': group_name, + 'requester': requester_email, + 'reason': reason + }) + debug_print(f"Created approval request: {approval_request}") + + # Create notifications for eligible approvers + _create_approval_notifications(approval_request, group if request_type != TYPE_DELETE_USER_DOCUMENTS else None) + + return approval_request + + except Exception as e: + log_event("[Approvals] Error creating approval request", { + 'error': str(e), + 'request_type': request_type, + 'group_id': group_id, + 'requester': requester_email + }, level=logging.ERROR) + debug_print(f"Error creating approval request: {e}") + raise + + +def get_pending_approvals( + user_id: str, + user_roles: List[str], + page: int = 1, + per_page: int = 20, + include_completed: bool = False, + request_type_filter: Optional[str] = None, + status_filter: str = 'pending' +) -> Dict[str, Any]: + """ + Get approval requests that the user is eligible to approve. 
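A usage sketch for `create_approval_request` (the group ID, requester details, and reason below are placeholders): raising a take-ownership request stores a pending document whose TTL drives the three-day auto-deny.

```python
# Minimal sketch; the module and constant names come from functions_approvals
# above, while the IDs and emails are illustrative placeholders.
from functions_approvals import create_approval_request, TYPE_TAKE_OWNERSHIP

request_doc = create_approval_request(
    request_type=TYPE_TAKE_OWNERSHIP,
    group_id="group-abc",
    requester_id="admin-user-id",
    requester_email="admin@contoso.com",
    requester_name="Requesting Admin",
    reason="Original owner has left the organization",
)
print(request_doc["status"], request_doc["expires_at"])  # 'pending', now + 3 days
```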
+ + Args: + user_id: Current user ID + user_roles: List of roles the user has (e.g., ['admin', 'ControlCenterAdmin']) + page: Page number for pagination + per_page: Items per page + include_completed: Include approved/denied/executed requests + request_type_filter: Filter by request type + status_filter: Filter by specific status ('pending', 'approved', 'denied', 'executed', 'all') + + Returns: + Dictionary with approvals list, total count, and pagination info + """ + try: + # Build query based on filters + query_parts = ["SELECT * FROM c WHERE 1=1"] + parameters = [] + + # Status filter + if status_filter != 'all': + # If specific status requested (pending, approved, denied, executed) + query_parts.append("AND c.status = @status") + parameters.append({"name": "@status", "value": status_filter}) + # else: 'all' means no status filter + + # Request type filter + if request_type_filter: + query_parts.append("AND c.request_type = @request_type") + parameters.append({"name": "@request_type", "value": request_type_filter}) + + # Order by created date descending + query_parts.append("ORDER BY c.created_at DESC") + + query = " ".join(query_parts) + + debug_print(f"📋 [GET_APPROVALS] Query: {query}") + debug_print(f"📋 [GET_APPROVALS] Parameters: {parameters}") + debug_print(f"📋 [GET_APPROVALS] status_filter: {status_filter}") + + # Execute cross-partition query (we need to see all groups) + items = list(cosmos_approvals_container.query_items( + query=query, + parameters=parameters, + enable_cross_partition_query=True + )) + + debug_print(f"📋 [GET_APPROVALS] Found {len(items)} total items from query") + + # Filter by user eligibility + # For pending requests: check if user can approve + # For completed requests: check if user has visibility (was involved or is admin/owner) + eligible_approvals = [] + for approval in items: + if status_filter == 'pending': + # For pending requests, check if user can approve + if _can_user_approve(approval, user_id, user_roles): + eligible_approvals.append(approval) + else: + # For completed requests, check if user has visibility + if _can_user_view(approval, user_id, user_roles): + eligible_approvals.append(approval) + + debug_print(f"📋 [GET_APPROVALS] After eligibility filter: {len(eligible_approvals)} approvals") + + # Paginate + total_count = len(eligible_approvals) + start_idx = (page - 1) * per_page + end_idx = start_idx + per_page + paginated_approvals = eligible_approvals[start_idx:end_idx] + + debug_print(f"User {user_id} fetched pending approvals: page {page}, per_page {per_page}, total {total_count}") + + return { + 'approvals': paginated_approvals, + 'total': total_count, + 'page': page, + 'per_page': per_page, + 'total_pages': (total_count + per_page - 1) // per_page + } + + except Exception as e: + log_event("[Approvals] Error fetching pending approvals", { + 'error': str(e), + 'user_id': user_id, + 'user_roles': user_roles + }) + debug_print(f"Error fetching pending approvals: {e}") + raise + + +def approve_request( + approval_id: str, + group_id: str, + approver_id: str, + approver_email: str, + approver_name: str, + comment: Optional[str] = None +) -> Dict[str, Any]: + """ + Approve an approval request. 
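A sketch of paging through the requests a caller may act on (the user ID and role list are placeholders): `status_filter='pending'` applies the approval-eligibility check, while other statuses fall back to the broader visibility check.

```python
# Minimal sketch; user_id and user_roles are placeholders.
from functions_approvals import get_pending_approvals

page = 1
while True:
    result = get_pending_approvals(
        user_id="admin-user-id",
        user_roles=["Admin", "ControlCenterAdmin"],
        page=page,
        per_page=20,
        status_filter="pending",
    )
    for approval in result["approvals"]:
        print(approval["id"], approval["request_type"], approval["group_name"])
    if page >= result["total_pages"]:
        break
    page += 1
```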
+ + Args: + approval_id: ID of the approval request + group_id: Group ID (partition key) + approver_id: User ID of approver + approver_email: Email of approver + approver_name: Display name of approver + comment: Optional comment from approver + + Returns: + Updated approval request document + """ + try: + # Get the approval request + approval = cosmos_approvals_container.read_item( + item=approval_id, + partition_key=group_id + ) + + # Validate status + if approval['status'] != STATUS_PENDING: + debug_print(f"Cannot approve request with status: {approval['status']}") + raise ValueError(f"Cannot approve request with status: {approval['status']}") + + # Update approval status + approval['status'] = STATUS_APPROVED + approval['approved_by_id'] = approver_id + approval['approved_by_email'] = approver_email + approval['approved_by_name'] = approver_name + approval['approved_at'] = datetime.utcnow().isoformat() + approval['approval_comment'] = comment + approval['ttl'] = -1 # Remove TTL so it doesn't auto-delete + + # Save updated approval + cosmos_approvals_container.upsert_item(approval) + + # Log event + log_event("[Approvals] Request approved", { + 'approval_id': approval_id, + 'request_type': approval['request_type'], + 'group_id': group_id, + 'approver': approver_email, + 'comment': comment + }) + debug_print(f"Approved request: {approval}") + + # Create notification for requester + create_notification( + user_id=approval['requester_id'], + notification_type='approval_request_approved', + title=f"Request Approved: {_format_request_type(approval['request_type'])}", + message=f"Your request for {approval['group_name']} has been approved by {approver_name}.", + link_url='/approvals', + link_context={ + 'approval_id': approval_id + }, + metadata={ + 'approval_id': approval_id, + 'request_type': approval['request_type'], + 'group_id': group_id, + 'approver_email': approver_email, + 'comment': comment + } + ) + + return approval + + except Exception as e: + log_event("[Approvals] Error approving request", { + 'error': str(e), + 'approval_id': approval_id, + 'group_id': group_id, + 'approver': approver_email + }) + debug_print(f"Error approving request: {e}") + raise + + +def deny_request( + approval_id: str, + group_id: str, + denier_id: str, + denier_email: str, + denier_name: str, + comment: str, + auto_denied: bool = False +) -> Dict[str, Any]: + """ + Deny an approval request. 
+ + Args: + approval_id: ID of the approval request + group_id: Group ID (partition key) + denier_id: User ID of person denying (or 'system' for auto-deny) + denier_email: Email of denier + denier_name: Display name of denier + comment: Reason for denial + auto_denied: Whether this is an automatic denial + + Returns: + Updated approval request document + """ + try: + # Get the approval request + approval = cosmos_approvals_container.read_item( + item=approval_id, + partition_key=group_id + ) + + # Validate status (allow denying pending requests) + if approval['status'] not in [STATUS_PENDING]: + debug_print(f"Cannot deny request with status: {approval['status']}") + raise ValueError(f"Cannot deny request with status: {approval['status']}") + + # Update approval status + approval['status'] = STATUS_AUTO_DENIED if auto_denied else STATUS_DENIED + approval['approved_by_id'] = denier_id + approval['approved_by_email'] = denier_email + approval['approved_by_name'] = denier_name + approval['approved_at'] = datetime.utcnow().isoformat() + approval['approval_comment'] = comment + approval['ttl'] = -1 # Remove TTL + + # Save updated approval + cosmos_approvals_container.upsert_item(approval) + + # Log event + log_event("[Approvals] Request denied", { + 'approval_id': approval_id, + 'request_type': approval['request_type'], + 'group_id': group_id, + 'denier': denier_email, + 'auto_denied': auto_denied, + 'comment': comment + }) + debug_print(f"Request denied: {approval_id}") + + # Create notification for requester (only if not auto-denied) + if not auto_denied: + create_notification( + user_id=approval['requester_id'], + notification_type='approval_request_denied', + title=f"Request Denied: {_format_request_type(approval['request_type'])}", + message=f"Your request for {approval['group_name']} was denied by {denier_name}.", + link_url='/approvals', + link_context={ + 'approval_id': approval_id + }, + metadata={ + 'approval_id': approval_id, + 'request_type': approval['request_type'], + 'group_id': group_id, + 'denier_email': denier_email, + 'comment': comment + } + ) + + return approval + + except Exception as e: + log_event("[Approvals] Error denying request", { + 'error': str(e), + 'approval_id': approval_id, + 'group_id': group_id, + 'denier_id': denier_id, + 'comment': comment, + 'auto_denied': auto_denied + }) + debug_print(f"Error denying request: {e}") + raise + + +def mark_approval_executed( + approval_id: str, + group_id: str, + success: bool, + result_message: str +) -> Dict[str, Any]: + """ + Mark an approved request as executed (or failed). 
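A sketch of the intended approve-then-execute sequence (all identifiers and the result message are placeholders): after an eligible approver approves, the caller performs the sensitive operation and then records the outcome so the request ends up `executed` or `failed`.

```python
# Minimal sketch; IDs, emails, and the result message are placeholders.
from functions_approvals import approve_request, mark_approval_executed

approval = approve_request(
    approval_id="approval-001",
    group_id="group-abc",          # partition key
    approver_id="owner-user-id",
    approver_email="owner@contoso.com",
    approver_name="Group Owner",
    comment="Approved per retention policy",
)

# ... perform the approved operation here (e.g., delete the group's documents) ...

mark_approval_executed(
    approval_id=approval["id"],
    group_id=approval["group_id"],
    success=True,
    result_message="Deleted 42 documents",
)
```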
+ + Args: + approval_id: ID of the approval request + group_id: Group ID (partition key) + success: Whether execution was successful + result_message: Result message or error + + Returns: + Updated approval request document + """ + try: + # Get the approval request + approval = cosmos_approvals_container.read_item( + item=approval_id, + partition_key=group_id + ) + + # Update execution status + approval['status'] = STATUS_EXECUTED if success else STATUS_FAILED + approval['executed_at'] = datetime.utcnow().isoformat() + approval['execution_result'] = result_message + + # Save updated approval + cosmos_approvals_container.upsert_item(approval) + + # Log event + log_event("[Approvals] Request executed", { + 'approval_id': approval_id, + 'request_type': approval['request_type'], + 'group_id': group_id, + 'success': success, + 'result': result_message + }) + debug_print(f"Marked approval as executed: {approval_id}, success: {success}") + + return approval + + except Exception as e: + log_event("[Approvals] Error marking request as executed", { + 'error': str(e), + 'approval_id': approval_id, + 'group_id': group_id, + 'success': success, + 'result': result_message + }) + debug_print(f"Error marking approval as executed: {e}") + raise + + +def get_approval_by_id(approval_id: str, group_id: str) -> Optional[Dict[str, Any]]: + """ + Get a specific approval request by ID. + + Args: + approval_id: ID of the approval request + group_id: Group ID (partition key) + + Returns: + Approval request document or None if not found + """ + try: + return cosmos_approvals_container.read_item( + item=approval_id, + partition_key=group_id + ) + except Exception: + log_event("[Approvals] Approval not found", { + 'approval_id': approval_id, + 'group_id': group_id + }) + debug_print(f"Approval not found: {approval_id}") + return None + + +def auto_deny_expired_approvals() -> int: + """ + Auto-deny approval requests that have expired (older than 3 days). + This function should be called by a scheduled job. 
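The module does not show how `auto_deny_expired_approvals` is wired to a schedule; as one hedged option, a simple timer loop could invoke it periodically (the hourly interval and the threading approach below are assumptions, not the app's actual scheduling mechanism):

```python
# Minimal sketch of one possible scheduler; the app may instead use a cron
# job, an Azure Functions timer, or another mechanism entirely.
import threading
from functions_approvals import auto_deny_expired_approvals

def sweep_expired_approvals(interval_seconds: int = 3600) -> None:
    denied = auto_deny_expired_approvals()
    print(f"Auto-denied {denied} expired approval request(s)")
    threading.Timer(
        interval_seconds, sweep_expired_approvals, args=(interval_seconds,)
    ).start()

# sweep_expired_approvals()  # start the recurring sweep at application startup
```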
+ + Returns: + Number of approvals auto-denied + """ + try: + # Query for pending approvals + query = "SELECT * FROM c WHERE c.status = @status" + parameters = [{"name": "@status", "value": STATUS_PENDING}] + + pending_approvals = list(cosmos_approvals_container.query_items( + query=query, + parameters=parameters, + enable_cross_partition_query=True + )) + + now = datetime.utcnow() + denied_count = 0 + + for approval in pending_approvals: + expires_at = datetime.fromisoformat(approval['expires_at']) + + # Check if expired + if now >= expires_at: + try: + deny_request( + approval_id=approval['id'], + group_id=approval['group_id'], + denier_id='system', + denier_email='system@simplechat', + denier_name='System Auto-Deny', + comment='Request automatically denied after 3 days without approval.', + auto_denied=True + ) + denied_count += 1 + except Exception as e: + log_event("[Approvals] Error auto-denying expired approval", { + 'approval_id': approval['id'], + 'error': str(e) + }) + debug_print(f"Error auto-denying approval {approval['id']}: {e}") + + if denied_count > 0: + log_event("[Approvals] Auto-denied expired approvals", { + 'denied_count': denied_count + }) + debug_print(f"Auto-denied {denied_count} expired approvals") + + return denied_count + + except Exception as e: + log_event("[Approvals] Error in auto_deny_expired_approvals", { + 'error': str(e) + }) + debug_print(f"Error in auto_deny_expired_approvals: {e}") + return 0 + + +def _can_user_view( + approval: Dict[str, Any], + user_id: str, + user_roles: List[str] +) -> bool: + """ + Check if a user can view a specific approval request (including completed ones). + + Visibility rules (more permissive than approval rights): + - User is the requester, OR + - User is the approver, OR + - User is the group owner, OR + - User is the personal workspace owner (for user document operations), OR + - User has 'ControlCenterAdmin' role, OR + - User has 'Admin' role + + Args: + approval: Approval request document + user_id: User ID to check + user_roles: List of roles the user has + + Returns: + True if user can view, False otherwise + """ + # Check if user was involved in the request + is_requester = approval.get('requester_id') == user_id + is_approver = approval.get('approved_by_id') == user_id + + # Check if user is the group owner + is_group_owner = approval.get('group_owner_id') == user_id + + # Check if user is the personal workspace owner (for user document deletion) + is_personal_workspace_owner = False + if approval.get('request_type') == TYPE_DELETE_USER_DOCUMENTS: + target_user_id = approval.get('metadata', {}).get('user_id') + is_personal_workspace_owner = target_user_id == user_id + + # Check if user has admin roles + has_control_center_admin = 'ControlCenterAdmin' in user_roles + has_admin = 'Admin' in user_roles or 'admin' in user_roles + + # User can view if they meet any of these criteria + return (is_requester or is_approver or is_group_owner or + is_personal_workspace_owner or has_control_center_admin or has_admin) + + +def _can_user_approve( + approval: Dict[str, Any], + user_id: str, + user_roles: List[str] +) -> bool: + """ + Check if a user is eligible to approve a specific request. 
+ + Eligibility rules: + - User must be the group owner (for group operations), OR + - User must be the personal workspace owner (for user document operations), OR + - User must have 'ControlCenterAdmin' role, OR + - User must have 'Admin' role + - User cannot be the requester (unless they're the only eligible approver) + + Args: + approval: Approval request document + user_id: User ID to check + user_roles: List of roles the user has + + Returns: + True if user can approve, False otherwise + """ + # Check if user is the group owner (for group-based approvals) + is_group_owner = approval.get('group_owner_id') == user_id + + # Check if user is the personal workspace owner (for user document deletion) + is_personal_workspace_owner = False + if approval.get('request_type') == TYPE_DELETE_USER_DOCUMENTS: + # For user document deletion, check if user owns the documents + target_user_id = approval.get('metadata', {}).get('user_id') + is_personal_workspace_owner = target_user_id == user_id + + # Check if user has admin roles (check both capitalized and lowercase) + has_control_center_admin = 'ControlCenterAdmin' in user_roles + has_admin = 'Admin' in user_roles or 'admin' in user_roles + + # User must have at least one eligibility criterion + if not (is_group_owner or is_personal_workspace_owner or has_control_center_admin or has_admin): + return False + + # Special case: If user is the requester, they can still approve if they're the only eligible approver + # This handles the case where there's only one admin in the system + if approval.get('requester_id') == user_id: + # Allow same-user approval (with documentation through the approval system) + return True + + return True + + +def _create_approval_notifications( + approval: Dict[str, Any], + group: Optional[Dict[str, Any]] +) -> None: + """ + Create notifications for all users who can approve the request using assignment-based targeting. + Notifications target users by roles (Admin, ControlCenterAdmin) and/or ownership IDs. 
+ + For user management (delete_user_documents): + - Notifies: Control Center Admins, Admins, and the affected user + For group management (transfer_ownership, delete_documents, delete_group, take_ownership): + - Notifies: Control Center Admins, Admins, and the group owner + + Args: + approval: Approval request document + group: Group document (None for user-related approvals) + """ + try: + log_event("[Approvals] Creating assignment-based approval notifications", { + 'approval_id': approval['id'], + 'group_id': approval['group_id'], + 'request_type': approval['request_type'] + }) + debug_print(f"Creating assignment-based approval notifications for approval: {approval['id']}") + + # Build assignment criteria based on request type + assignment = { + 'roles': ['Admin', 'ControlCenterAdmin'] # Always include admin roles + } + + # Add ownership-based targeting + if approval['request_type'] == TYPE_DELETE_USER_DOCUMENTS: + # For user document deletion: notify the user whose documents are being deleted + user_id = approval.get('metadata', {}).get('user_id') + if user_id: + assignment['personal_workspace_owner_id'] = user_id + log_event("[Approvals] Targeting user for document deletion", { + 'user_id': user_id, + 'approval_id': approval['id'] + }) + debug_print(f"Added personal workspace owner {user_id} to notification assignment") + else: + # For group operations: notify the group owner + if group: + group_owner_id = group.get('owner', {}).get('id') + if group_owner_id: + assignment['group_owner_id'] = group_owner_id + log_event("[Approvals] Targeting group owner", { + 'group_owner_id': group_owner_id, + 'approval_id': approval['id'] + }) + debug_print(f"Added group owner {group_owner_id} to notification assignment") + else: + log_event("[Approvals] No group provided for group-based approval", { + 'approval_id': approval['id'], + 'request_type': approval['request_type'] + }, level=logging.WARNING) + + log_event("[Approvals] Notification assignment", { + 'approval_id': approval['id'], + 'assignment': assignment + }) + debug_print(f"Notification assignment for approval {approval['id']}: {assignment}") + + # For transfer ownership requests, also notify the new owner (informational) + if approval['request_type'] == TYPE_TRANSFER_OWNERSHIP: + new_owner_id = approval.get('metadata', {}).get('new_owner_id') + if new_owner_id and new_owner_id != approval['requester_id']: + # Create informational notification for new owner + try: + log_event("[Approvals] Notifying new owner", { + 'user_id': new_owner_id, + 'approval_id': approval['id'] + }) + debug_print(f"Notifying new owner {new_owner_id} about transfer request") + create_notification( + group_id=approval['group_id'], + notification_type='approval_request_pending', + title=f"Ownership Transfer Pending", + message=f"{approval['requester_name']} has requested to transfer ownership of {approval['group_name']} to you. 
Awaiting approval.", + link_url='/approvals', + link_context={ + 'approval_id': approval['id'] + }, + metadata={ + 'approval_id': approval['id'], + 'request_type': approval['request_type'], + 'group_id': approval['group_id'], + 'requester_email': approval['requester_email'] + }, + assignment={ + 'personal_workspace_owner_id': new_owner_id # Only new owner sees this + } + ) + debug_print(f"Successfully notified new owner {new_owner_id}") + except Exception as notify_error: + log_event("[Approvals] Error notifying new owner", { + 'error': str(notify_error), + 'user_id': new_owner_id, + 'approval_id': approval['id'] + }) + debug_print(f"Error notifying new owner {new_owner_id}: {str(notify_error)}") + + # Create single notification with assignment - visible to all eligible approvers + try: + log_event("[Approvals] Creating approval notification with assignment", { + 'approval_id': approval['id'], + 'assignment': assignment + }) + debug_print(f"Creating approval notification with assignment for approval {approval['id']}") + create_notification( + group_id=approval['group_id'], + notification_type='approval_request_pending', + title=f"Approval Required: {_format_request_type(approval['request_type'])}", + message=f"{approval['requester_name']} requests {_format_request_type(approval['request_type'])} for {approval['group_name']}. Reason: {approval.get('reason', 'Not provided')}", + link_url='/approvals', + link_context={ + 'approval_id': approval['id'] + }, + metadata={ + 'approval_id': approval['id'], + 'request_type': approval['request_type'], + 'group_id': approval['group_id'], + 'requester_email': approval['requester_email'], + 'reason': approval['reason'] + }, + assignment=assignment + ) + debug_print(f"Successfully created approval notification with assignment for approval {approval['id']}") + except Exception as notify_error: + log_event("[Approvals] Error creating approval notification", { + 'error': str(notify_error), + 'approval_id': approval['id'] + }) + debug_print(f"Error creating approval notification for approval {approval['id']}: {str(notify_error)}") + + except Exception as e: + log_event("[Approvals] Error notifying users about approval request", { + 'error': str(e), + 'approval_id': approval['id'] + }) + debug_print(f"Error notifying users about approval request {approval['id']}: {str(e)}") + # Don't raise - notifications are non-critical + + +def _format_request_type(request_type: str) -> str: + """ + Format request type for display. 
+ + Args: + request_type: Request type constant + + Returns: + Human-readable request type string + """ + type_labels = { + TYPE_TAKE_OWNERSHIP: "Take Ownership", + TYPE_TRANSFER_OWNERSHIP: "Transfer Ownership", + TYPE_DELETE_DOCUMENTS: "Delete All Documents", + TYPE_DELETE_GROUP: "Delete Group", + TYPE_DELETE_USER_DOCUMENTS: "Delete All User Documents" + } + return type_labels.get(request_type, request_type) diff --git a/application/single_app/functions_authentication.py b/application/single_app/functions_authentication.py index a6716029..f3fd1ce9 100644 --- a/application/single_app/functions_authentication.py +++ b/application/single_app/functions_authentication.py @@ -2,6 +2,7 @@ from config import * from functions_settings import * +from functions_debug import debug_print # Default redirect path for OAuth consent flow (must match your Azure AD app registration) REDIRECT_PATH = getattr(globals(), 'REDIRECT_PATH', '/getAToken') @@ -37,7 +38,7 @@ def _load_cache(): cache.deserialize(session["token_cache"]) except Exception as e: # Handle potential corruption or format issues gracefully - print(f"Warning: Could not deserialize token cache: {e}. Starting fresh.") + debug_print(f"Warning: Could not deserialize token cache: {e}. Starting fresh.") session.pop("token_cache", None) # Clear corrupted cache return cache @@ -47,7 +48,7 @@ def _save_cache(cache): try: session["token_cache"] = cache.serialize() except Exception as e: - print(f"Error: Could not serialize token cache: {e}") + debug_print(f"Error: Could not serialize token cache: {e}") # Decide how to handle this, maybe clear cache or log extensively # session.pop("token_cache", None) # Option: Clear on serialization failure @@ -82,7 +83,7 @@ def get_valid_access_token(scopes=None): Returns the access token string or None if refresh failed or user not logged in. 
""" if "user" not in session: - print("get_valid_access_token: No user in session.") + debug_print("get_valid_access_token: No user in session.") return None # User not logged in required_scopes = scopes or SCOPE # Use default SCOPE if none provided @@ -105,37 +106,40 @@ def get_valid_access_token(scopes=None): break if not account: account = accounts[0] # Fallback to first account if no perfect match - print(f"Warning: Using first account found ({account.get('username')}) as home_account_id match failed.") + debug_print(f"Warning: Using first account found ({account.get('username')}) as home_account_id match failed.") if account: # Try to get token silently (checks cache, then uses refresh token) result = msal_app.acquire_token_silent(required_scopes, account=account) _save_cache(msal_app.token_cache) # Save cache state AFTER attempt + debug_print(f"User account name: {account.get('username')}") + debug_print(f"All roles assigned to user: {user_info.get('roles')}") + if result and "access_token" in result: # Optional: Check expiry if you want fine-grained control, but MSAL usually handles it # expires_in = result.get('expires_in', 0) # if expires_in > 60: # Check if token is valid for at least 60 seconds - # print("get_valid_access_token: Token acquired silently.") + # debug_print("get_valid_access_token: Token acquired silently.") # return result['access_token'] # else: - # print("get_valid_access_token: Silent token expired or about to expire.") + # debug_print("get_valid_access_token: Silent token expired or about to expire.") # # MSAL should have refreshed, but if not, fall through - print(f"get_valid_access_token: Token acquired silently for scopes: {required_scopes}") + debug_print(f"get_valid_access_token: Token acquired silently for scopes: {required_scopes}") return result['access_token'] else: # acquire_token_silent failed (e.g., refresh token expired, needs interaction) - print("get_valid_access_token: acquire_token_silent failed. Needs re-authentication.") + debug_print("get_valid_access_token: acquire_token_silent failed. Needs re-authentication.") # Log the specific error if available in result if result and ('error' in result or 'error_description' in result): - print(f"MSAL Error: {result.get('error')}, Description: {result.get('error_description')}") + debug_print(f"MSAL Error: {result.get('error')}, Description: {result.get('error_description')}") # Optionally clear session or specific keys if refresh consistently fails # session.pop("token_cache", None) # session.pop("user", None) return None # Indicate failure to get a valid token else: - print("get_valid_access_token: No matching account found in MSAL cache.") + debug_print("get_valid_access_token: No matching account found in MSAL cache.") # This might happen if the cache was cleared or the user logged in differently return None # Cannot acquire token without an account context @@ -146,7 +150,7 @@ def get_valid_access_token_for_plugins(scopes=None): Returns the access token string or None if refresh failed or user not logged in. 
""" if "user" not in session: - print("get_valid_access_token: No user in session.") + debug_print("get_valid_access_token: No user in session.") return { "error": "not_logged_in", "message": "User is not logged in.", @@ -174,10 +178,10 @@ def get_valid_access_token_for_plugins(scopes=None): break if not account: account = accounts[0] # Fallback to first account if no perfect match - print(f"Warning: Using first account found ({account.get('username')}) as home_account_id match failed.") + debug_print(f"Warning: Using first account found ({account.get('username')}) as home_account_id match failed.") if not account: - print("get_valid_access_token: No matching account found in MSAL cache.") + debug_print("get_valid_access_token: No matching account found in MSAL cache.") return { "error": "no_account", "message": "No matching account found in MSAL cache.", @@ -189,13 +193,13 @@ def get_valid_access_token_for_plugins(scopes=None): _save_cache(msal_app.token_cache) if result and "access_token" in result: - print(f"get_valid_access_token: Token acquired silently for scopes: {required_scopes}") + debug_print(f"get_valid_access_token: Token acquired silently for scopes: {required_scopes}") return {"access_token": result['access_token']} # If we reach here, it means silent acquisition failed - print("get_valid_access_token: acquire_token_silent failed. Needs re-authentication or received invalid grants.") + debug_print("get_valid_access_token: acquire_token_silent failed. Needs re-authentication or received invalid grants.") if result is None: # Assume invalid grants or no token - print("result is None: get_valid_access_token: Consent required.") + debug_print("result is None: get_valid_access_token: Consent required.") host_url = request.host_url.rstrip('/') # Only enforce https if not localhost or 127.0.0.1 if not (host_url.startswith('http://localhost') or host_url.startswith('http://127.0.0.1')): @@ -216,7 +220,7 @@ def get_valid_access_token_for_plugins(scopes=None): error_code = result.get('error') if result else None error_desc = result.get('error_description') if result else None - print(f"MSAL Error: {error_code}, Description: {error_desc}") + debug_print(f"MSAL Error: {error_code}, Description: {error_desc}") if error_code == "invalid_grant" and error_desc and ("AADSTS65001" in error_desc or "consent_required" in error_desc): host_url = request.host_url.rstrip('/') @@ -265,7 +269,7 @@ def get_video_indexer_account_token(settings, video_id=None): def get_video_indexer_managed_identity_token(settings, video_id=None): """ - For ARM-based VideoIndexer accounts: + For ARM-based VideoIndexer accounts using managed identity: 1) Acquire an ARM token with DefaultAzureCredential 2) POST to the ARM generateAccessToken endpoint 3) Return the account-level accessToken @@ -274,6 +278,7 @@ def get_video_indexer_managed_identity_token(settings, video_id=None): debug_print(f"[VIDEO INDEXER AUTH] Starting token acquisition for video_id: {video_id}") debug_print(f"[VIDEO INDEXER AUTH] Azure environment: {AZURE_ENVIRONMENT}") + debug_print(f"[VIDEO INDEXER AUTH] Using managed identity authentication") # 1) ARM token if AZURE_ENVIRONMENT == "usgovernment": @@ -290,7 +295,7 @@ def get_video_indexer_managed_identity_token(settings, video_id=None): debug_print(f"[VIDEO INDEXER AUTH] DefaultAzureCredential initialized successfully") arm_token = credential.get_token(arm_scope).token debug_print(f"[VIDEO INDEXER AUTH] ARM token acquired successfully (length: {len(arm_token) if arm_token else 0})") - 
print("[VIDEO] ARM token acquired", flush=True) + debug_print("[VIDEO] ARM token acquired", flush=True) except Exception as e: debug_print(f"[VIDEO INDEXER AUTH] ERROR acquiring ARM token: {str(e)}") raise @@ -357,7 +362,7 @@ def get_video_indexer_managed_identity_token(settings, video_id=None): raise ValueError("No accessToken found in ARM API response") debug_print(f"[VIDEO INDEXER AUTH] Account token acquired successfully (length: {len(ai)})") - print(f"[VIDEO] Account token acquired (len={len(ai)})", flush=True) + debug_print(f"[VIDEO] Account token acquired (len={len(ai)})", flush=True) return ai except requests.exceptions.RequestException as e: debug_print(f"[VIDEO INDEXER AUTH] ERROR in ARM API request: {str(e)}") @@ -388,7 +393,7 @@ def get_microsoft_entra_jwks(): jwks_response = requests.get(jwks_uri).json() JWKS_CACHE = {key['kid']: key for key in jwks_response['keys']} except requests.exceptions.RequestException as e: - print(f"Error fetching JWKS: {e}") + debug_print(f"Error fetching JWKS: {e}") return None return JWKS_CACHE @@ -443,7 +448,7 @@ def accesstoken_required(f): @wraps(f) def decorated_function(*args, **kwargs): - print("accesstoken_required") + debug_print("accesstoken_required") auth_header = request.headers.get('Authorization') if not auth_header: @@ -463,7 +468,7 @@ def decorated_function(*args, **kwargs): if not roles or "ExternalApi" not in roles: return jsonify({"message": "Forbidden: ExternalApi role required"}), 403 - print("User is valid") + debug_print("User is valid") # You can now access claims from `data`, e.g., data['sub'], data['name'], data['roles'] #kwargs['user_claims'] = data # Pass claims to the decorated function # NOT NEEDED FOR NOW @@ -480,10 +485,10 @@ def decorated_function(*args, **kwargs): ) or request.path.startswith('/api/') if is_api_request: - print(f"API request to {request.path} blocked (401 Unauthorized). No valid session.") + debug_print(f"API request to {request.path} blocked (401 Unauthorized). No valid session.") return jsonify({"error": "Unauthorized", "message": "Authentication required"}), 401 else: - print(f"Browser request to {request.path} redirected ta login. No valid session.") + debug_print(f"Browser request to {request.path} redirected ta login. 
No valid session.") # Get settings from database, with environment variable fallback from functions_settings import get_settings settings = get_settings() @@ -547,7 +552,7 @@ def check_user_access_status(user_id): return True, None # Default to allow if status is unknown except Exception as e: - print(f"Error checking user access status: {e}") + debug_print(f"Error checking user access status: {e}") return True, None # Default to allow on error to prevent lockouts def user_required(f): @@ -635,7 +640,7 @@ def decorated_function(*args, **kwargs): else: return f"File Upload Denied: {reason}", 403 except Exception as e: - print(f"Error checking file upload permissions: {e}") + debug_print(f"Error checking file upload permissions: {e}") # Default to allow on error to prevent breaking functionality return f(*args, **kwargs) @@ -660,14 +665,30 @@ def decorated_function(*args, **kwargs): settings = get_settings() require_member_of_feedback_admin = settings.get("require_member_of_feedback_admin", False) + has_feedback_admin_role = 'roles' in user and 'FeedbackAdmin' in user['roles'] + has_admin_role = 'roles' in user and 'Admin' in user['roles'] + + # If requirement is enabled, only FeedbackAdmin role grants access if require_member_of_feedback_admin: - if 'roles' not in user or 'FeedbackAdmin' not in user['roles']: - is_api_request = (request.accept_mimetypes.accept_json and not request.accept_mimetypes.accept_html) or request.path.startswith('/api/') - if is_api_request: - return jsonify({"error": "Forbidden", "message": "Insufficient permissions (FeedbackAdmin role required)"}), 403 - else: - return "Forbidden: FeedbackAdmin role required", 403 - return f(*args, **kwargs) + if has_feedback_admin_role: + return f(*args, **kwargs) + else: + is_api_request = (request.accept_mimetypes.accept_json and not request.accept_mimetypes.accept_html) or request.path.startswith('/api/') + if is_api_request: + return jsonify({"error": "Forbidden", "message": "Insufficient permissions (FeedbackAdmin role required)"}), 403 + else: + return "Forbidden: FeedbackAdmin role required", 403 + + # If requirement is not enabled, only regular admins can access + if has_admin_role: + return f(*args, **kwargs) + + # No access if neither condition is met + is_api_request = (request.accept_mimetypes.accept_json and not request.accept_mimetypes.accept_html) or request.path.startswith('/api/') + if is_api_request: + return jsonify({"error": "Forbidden", "message": "Insufficient permissions"}), 403 + else: + return "Forbidden", 403 return decorated_function def safety_violation_admin_required(f): @@ -677,32 +698,85 @@ def decorated_function(*args, **kwargs): settings = get_settings() require_member_of_safety_violation_admin = settings.get("require_member_of_safety_violation_admin", False) + has_safety_admin_role = 'roles' in user and 'SafetyViolationAdmin' in user['roles'] + has_admin_role = 'roles' in user and 'Admin' in user['roles'] + + # If requirement is enabled, only SafetyViolationAdmin role grants access if require_member_of_safety_violation_admin: - if 'roles' not in user or 'SafetyViolationAdmin' not in user['roles']: + if has_safety_admin_role: + return f(*args, **kwargs) + else: is_api_request = (request.accept_mimetypes.accept_json and not request.accept_mimetypes.accept_html) or request.path.startswith('/api/') if is_api_request: return jsonify({"error": "Forbidden", "message": "Insufficient permissions (SafetyViolationAdmin role required)"}), 403 else: return "Forbidden: SafetyViolationAdmin role required", 403 
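A sketch of how these role-gated decorators are typically applied to a Flask route (the route path, view name, and the outer decorator name `feedback_admin_required` are assumptions inferred from the hunk above; `user_required` is defined earlier in this module):

```python
# Minimal sketch; the Flask app object and route are illustrative, and the
# import assumes these decorators are exported from functions_authentication.
from flask import Flask, jsonify
from functions_authentication import user_required, feedback_admin_required

app = Flask(__name__)  # illustrative; the real app object lives elsewhere

@app.route("/api/feedback/review")
@user_required
@feedback_admin_required  # FeedbackAdmin when enforcement is on, otherwise Admin
def review_feedback():
    return jsonify({"feedback": []})
```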
- return f(*args, **kwargs) + + # If requirement is not enabled, only regular admins can access + if has_admin_role: + return f(*args, **kwargs) + + # No access if neither condition is met + is_api_request = (request.accept_mimetypes.accept_json and not request.accept_mimetypes.accept_html) or request.path.startswith('/api/') + if is_api_request: + return jsonify({"error": "Forbidden", "message": "Insufficient permissions"}), 403 + else: + return "Forbidden", 403 return decorated_function -def control_center_admin_required(f): - @wraps(f) - def decorated_function(*args, **kwargs): - user = session.get('user', {}) - settings = get_settings() - require_member_of_control_center_admin = settings.get("require_member_of_control_center_admin", False) - - if require_member_of_control_center_admin: - if 'roles' not in user or 'ControlCenterAdmin' not in user['roles']: +def control_center_required(access_level='admin'): + """ + Unified Control Center access control decorator. + + Args: + access_level: 'admin' for full admin access, 'dashboard' for dashboard-only access + + Access logic: + - ControlCenterAdmin role → Full access to everything (admin + dashboard) + - ControlCenterDashboardReader role → Dashboard access only + - Regular admins → Access when role requirements are disabled (default) + """ + def decorator(f): + @wraps(f) + def decorated_function(*args, **kwargs): + user = session.get('user', {}) + settings = get_settings() + require_member_of_control_center_admin = settings.get("require_member_of_control_center_admin", False) + require_member_of_control_center_dashboard_reader = settings.get("require_member_of_control_center_dashboard_reader", False) + + has_admin_role = 'roles' in user and 'ControlCenterAdmin' in user['roles'] + has_dashboard_reader_role = 'roles' in user and 'ControlCenterDashboardReader' in user['roles'] + + # ControlCenterAdmin always has full access + if has_admin_role: + return f(*args, **kwargs) + + # For dashboard access, check if DashboardReader role grants access + if access_level == 'dashboard': + if require_member_of_control_center_dashboard_reader and has_dashboard_reader_role: + return f(*args, **kwargs) + + # Check if role requirements are enforced + if require_member_of_control_center_admin: + # Admin role required but user doesn't have it is_api_request = (request.accept_mimetypes.accept_json and not request.accept_mimetypes.accept_html) or request.path.startswith('/api/') if is_api_request: return jsonify({"error": "Forbidden", "message": "Insufficient permissions (ControlCenterAdmin role required)"}), 403 else: return "Forbidden: ControlCenterAdmin role required", 403 - return f(*args, **kwargs) - return decorated_function + + if access_level == 'dashboard' and require_member_of_control_center_dashboard_reader: + # Dashboard reader role required but user doesn't have it + is_api_request = (request.accept_mimetypes.accept_json and not request.accept_mimetypes.accept_html) or request.path.startswith('/api/') + if is_api_request: + return jsonify({"error": "Forbidden", "message": "Insufficient permissions (ControlCenterDashboardReader role required)"}), 403 + else: + return "Forbidden: ControlCenterDashboardReader role required", 403 + + # No role requirements enabled → allow all admins (default behavior) + return f(*args, **kwargs) + return decorated_function + return decorator def create_group_role_required(f): @wraps(f) @@ -761,7 +835,7 @@ def get_user_profile_image(): """ token = get_valid_access_token() if not token: - print("get_user_profile_image: 
Could not acquire access token") + debug_print("get_user_profile_image: Could not acquire access token") return None # Determine the correct Graph endpoint based on Azure environment @@ -792,15 +866,15 @@ def get_user_profile_image(): elif response.status_code == 404: # User has no profile image - print("get_user_profile_image: User has no profile image") + debug_print("get_user_profile_image: User has no profile image") return None else: - print(f"get_user_profile_image: Failed to fetch profile image. Status: {response.status_code}") + debug_print(f"get_user_profile_image: Failed to fetch profile image. Status: {response.status_code}") return None except requests.exceptions.RequestException as e: - print(f"get_user_profile_image: Request failed: {e}") + debug_print(f"get_user_profile_image: Request failed: {e}") return None except Exception as e: - print(f"get_user_profile_image: Unexpected error: {e}") + debug_print(f"get_user_profile_image: Unexpected error: {e}") return None diff --git a/application/single_app/functions_documents.py b/application/single_app/functions_documents.py index fd21bc81..017b819f 100644 --- a/application/single_app/functions_documents.py +++ b/application/single_app/functions_documents.py @@ -7,6 +7,7 @@ from functions_logging import * from functions_authentication import * from functions_debug import * +import azure.cognitiveservices.speech as speechsdk def allowed_file(filename, allowed_extensions=None): if not allowed_extensions: @@ -1687,6 +1688,57 @@ def get_document_metadata_for_citations(document_id, user_id=None, group_id=None # This is expected for documents without metadata return None +def get_document_metadata_for_citations(document_id, user_id=None, group_id=None, public_workspace_id=None): + """ + Retrieve keywords and abstract from a document for creating metadata citations. + Used to enhance search results with additional context from document metadata. 
+ + Args: + document_id: The document's unique identifier + user_id: User ID (for personal documents) + group_id: Group ID (for group documents) + public_workspace_id: Public workspace ID (for public documents) + + Returns: + dict: Dictionary with 'keywords' and 'abstract' fields, or None if document not found + """ + is_group = group_id is not None + is_public_workspace = public_workspace_id is not None + + # Determine the correct container + if is_public_workspace: + cosmos_container = cosmos_public_documents_container + elif is_group: + cosmos_container = cosmos_group_documents_container + else: + cosmos_container = cosmos_user_documents_container + + try: + # Read the document directly by ID + document_item = cosmos_container.read_item( + item=document_id, + partition_key=document_id + ) + + # Extract keywords and abstract + keywords = document_item.get('keywords', []) + abstract = document_item.get('abstract', '') + + # Return only if we have actual content + if keywords or abstract: + return { + 'keywords': keywords if keywords else [], + 'abstract': abstract if abstract else '', + 'file_name': document_item.get('file_name', 'Unknown') + } + + return None + + except Exception as e: + # Document not found or error reading - return None silently + # This is expected for documents without metadata + return None + def get_all_chunks(document_id, user_id, group_id=None, public_workspace_id=None): is_group = group_id is not None is_public_workspace = public_workspace_id is not None @@ -3801,6 +3853,355 @@ def process_doc(document_id, user_id, temp_file_path, original_filename, enable_ return total_chunks_saved, total_embedding_tokens, embedding_model_name +def process_xml(document_id, user_id, temp_file_path, original_filename, enable_enhanced_citations, update_callback, group_id=None, public_workspace_id=None): + """Processes XML files using RecursiveCharacterTextSplitter for structured content.""" + is_group = group_id is not None + is_public_workspace = public_workspace_id is not None + + update_callback(status="Processing XML file...") + total_chunks_saved = 0 + # Character-based chunking for XML structure preservation + max_chunk_size_chars = 4000 + + if enable_enhanced_citations: + args = { + "temp_file_path": temp_file_path, + "user_id": user_id, + "document_id": document_id, + "blob_filename": original_filename, + "update_callback": update_callback + } + + if is_group: + args["group_id"] = group_id + elif is_public_workspace: + args["public_workspace_id"] = public_workspace_id + + upload_to_blob(**args) + + try: + # Read XML content + try: + with open(temp_file_path, 'r', encoding='utf-8') as f: + xml_content = f.read() + except Exception as e: + raise Exception(f"Error reading XML file {original_filename}: {e}") + + # Use RecursiveCharacterTextSplitter with XML-aware separators + # This preserves XML structure better than simple word splitting + xml_splitter = RecursiveCharacterTextSplitter( + chunk_size=max_chunk_size_chars, + chunk_overlap=0, + length_function=len, + separators=["\n\n", "\n", ">", " ", ""], # XML-friendly separators + is_separator_regex=False + ) + + # Split the XML content + final_chunks = xml_splitter.split_text(xml_content) + + initial_chunk_count = len(final_chunks) + update_callback(number_of_pages=initial_chunk_count) + + for idx, chunk_content in enumerate(final_chunks, start=1): + # Skip empty chunks + if not chunk_content or not chunk_content.strip(): + print(f"Skipping empty XML chunk {idx}/{initial_chunk_count}") + continue + + update_callback( + 
current_file_chunk=idx, + status=f"Saving chunk {idx}/{initial_chunk_count}..." + ) + args = { + "page_text_content": chunk_content, + "page_number": total_chunks_saved + 1, + "file_name": original_filename, + "user_id": user_id, + "document_id": document_id + } + + if is_public_workspace: + args["public_workspace_id"] = public_workspace_id + elif is_group: + args["group_id"] = group_id + + save_chunks(**args) + total_chunks_saved += 1 + + # Final update with actual chunks saved + if total_chunks_saved != initial_chunk_count: + update_callback(number_of_pages=total_chunks_saved) + print(f"Adjusted final chunk count from {initial_chunk_count} to {total_chunks_saved} after skipping empty chunks.") + + except Exception as e: + print(f"Error during XML processing for {original_filename}: {type(e).__name__}: {e}") + raise Exception(f"Failed processing XML file {original_filename}: {e}") + + return total_chunks_saved + +def process_yaml(document_id, user_id, temp_file_path, original_filename, enable_enhanced_citations, update_callback, group_id=None, public_workspace_id=None): + """Processes YAML files using RecursiveCharacterTextSplitter for structured content.""" + is_group = group_id is not None + is_public_workspace = public_workspace_id is not None + + update_callback(status="Processing YAML file...") + total_chunks_saved = 0 + # Character-based chunking for YAML structure preservation + max_chunk_size_chars = 4000 + + if enable_enhanced_citations: + args = { + "temp_file_path": temp_file_path, + "user_id": user_id, + "document_id": document_id, + "blob_filename": original_filename, + "update_callback": update_callback + } + + if is_public_workspace: + args["public_workspace_id"] = public_workspace_id + elif is_group: + args["group_id"] = group_id + + upload_to_blob(**args) + + try: + # Read YAML content + try: + with open(temp_file_path, 'r', encoding='utf-8') as f: + yaml_content = f.read() + except Exception as e: + raise Exception(f"Error reading YAML file {original_filename}: {e}") + + # Use RecursiveCharacterTextSplitter with YAML-aware separators + # This preserves YAML structure better than simple word splitting + yaml_splitter = RecursiveCharacterTextSplitter( + chunk_size=max_chunk_size_chars, + chunk_overlap=0, + length_function=len, + separators=["\n\n", "\n", "- ", " ", ""], # YAML-friendly separators + is_separator_regex=False + ) + + # Split the YAML content + final_chunks = yaml_splitter.split_text(yaml_content) + + initial_chunk_count = len(final_chunks) + update_callback(number_of_pages=initial_chunk_count) + + for idx, chunk_content in enumerate(final_chunks, start=1): + # Skip empty chunks + if not chunk_content or not chunk_content.strip(): + print(f"Skipping empty YAML chunk {idx}/{initial_chunk_count}") + continue + + update_callback( + current_file_chunk=idx, + status=f"Saving chunk {idx}/{initial_chunk_count}..." 
+ ) + args = { + "page_text_content": chunk_content, + "page_number": total_chunks_saved + 1, + "file_name": original_filename, + "user_id": user_id, + "document_id": document_id + } + + if is_public_workspace: + args["public_workspace_id"] = public_workspace_id + elif is_group: + args["group_id"] = group_id + + save_chunks(**args) + total_chunks_saved += 1 + + # Final update with actual chunks saved + if total_chunks_saved != initial_chunk_count: + update_callback(number_of_pages=total_chunks_saved) + print(f"Adjusted final chunk count from {initial_chunk_count} to {total_chunks_saved} after skipping empty chunks.") + + except Exception as e: + print(f"Error during YAML processing for {original_filename}: {type(e).__name__}: {e}") + raise Exception(f"Failed processing YAML file {original_filename}: {e}") + + return total_chunks_saved + +def process_log(document_id, user_id, temp_file_path, original_filename, enable_enhanced_citations, update_callback, group_id=None, public_workspace_id=None): + """Processes LOG files using line-based chunking to maintain log record integrity.""" + is_group = group_id is not None + is_public_workspace = public_workspace_id is not None + + update_callback(status="Processing LOG file...") + total_chunks_saved = 0 + target_words_per_chunk = 1000 # Word-based chunking for better semantic grouping + + if enable_enhanced_citations: + args = { + "temp_file_path": temp_file_path, + "user_id": user_id, + "document_id": document_id, + "blob_filename": original_filename, + "update_callback": update_callback + } + + if is_public_workspace: + args["public_workspace_id"] = public_workspace_id + elif is_group: + args["group_id"] = group_id + + upload_to_blob(**args) + + try: + with open(temp_file_path, 'r', encoding='utf-8') as f: + content = f.read() + + # Split by lines to maintain log record integrity + lines = content.splitlines(keepends=True) # Keep line endings + + if not lines: + raise Exception(f"LOG file {original_filename} is empty") + + # Chunk by accumulating lines until reaching target word count + final_chunks = [] + current_chunk_lines = [] + current_chunk_word_count = 0 + + for line in lines: + line_word_count = len(line.split()) + + # If adding this line exceeds target AND we already have content + if current_chunk_word_count + line_word_count > target_words_per_chunk and current_chunk_lines: + # Finalize current chunk + final_chunks.append("".join(current_chunk_lines)) + # Start new chunk with current line + current_chunk_lines = [line] + current_chunk_word_count = line_word_count + else: + # Add line to current chunk + current_chunk_lines.append(line) + current_chunk_word_count += line_word_count + + # Add the last remaining chunk if it has content + if current_chunk_lines: + final_chunks.append("".join(current_chunk_lines)) + + num_chunks = len(final_chunks) + update_callback(number_of_pages=num_chunks) + + for idx, chunk_content in enumerate(final_chunks, start=1): + if chunk_content.strip(): + update_callback( + current_file_chunk=idx, + status=f"Saving chunk {idx}/{num_chunks}..." 
+ ) + args = { + "page_text_content": chunk_content, + "page_number": idx, + "file_name": original_filename, + "user_id": user_id, + "document_id": document_id + } + + if is_public_workspace: + args["public_workspace_id"] = public_workspace_id + elif is_group: + args["group_id"] = group_id + + save_chunks(**args) + total_chunks_saved += 1 + + except Exception as e: + raise Exception(f"Failed processing LOG file {original_filename}: {e}") + + return total_chunks_saved + +def process_doc(document_id, user_id, temp_file_path, original_filename, enable_enhanced_citations, update_callback, group_id=None, public_workspace_id=None): + """ + Processes .doc and .docm files using docx2txt library. + Note: .docx files still use Document Intelligence for better formatting preservation. + """ + is_group = group_id is not None + is_public_workspace = public_workspace_id is not None + + update_callback(status=f"Processing {original_filename.split('.')[-1].upper()} file...") + total_chunks_saved = 0 + target_words_per_chunk = 400 # Consistent with other text-based chunking + + if enable_enhanced_citations: + args = { + "temp_file_path": temp_file_path, + "user_id": user_id, + "document_id": document_id, + "blob_filename": original_filename, + "update_callback": update_callback + } + + if is_public_workspace: + args["public_workspace_id"] = public_workspace_id + elif is_group: + args["group_id"] = group_id + + upload_to_blob(**args) + + try: + # Import docx2txt here to avoid dependency issues if not installed + try: + import docx2txt + except ImportError: + raise Exception("docx2txt library is required for .doc and .docm file processing. Install with: pip install docx2txt") + + # Extract text from .doc or .docm file + try: + text_content = docx2txt.process(temp_file_path) + except Exception as e: + raise Exception(f"Error extracting text from {original_filename}: {e}") + + if not text_content or not text_content.strip(): + raise Exception(f"No text content extracted from {original_filename}") + + # Split into words for chunking + words = text_content.split() + if not words: + raise Exception(f"No text content found in {original_filename}") + + # Create chunks of target_words_per_chunk words + final_chunks = [] + for i in range(0, len(words), target_words_per_chunk): + chunk_words = words[i:i + target_words_per_chunk] + chunk_text = " ".join(chunk_words) + final_chunks.append(chunk_text) + + num_chunks = len(final_chunks) + update_callback(number_of_pages=num_chunks) + + for idx, chunk_content in enumerate(final_chunks, start=1): + if chunk_content.strip(): + update_callback( + current_file_chunk=idx, + status=f"Saving chunk {idx}/{num_chunks}..." 
+ ) + args = { + "page_text_content": chunk_content, + "page_number": idx, + "file_name": original_filename, + "user_id": user_id, + "document_id": document_id + } + + if is_public_workspace: + args["public_workspace_id"] = public_workspace_id + elif is_group: + args["group_id"] = group_id + + save_chunks(**args) + total_chunks_saved += 1 + + except Exception as e: + raise Exception(f"Failed processing {original_filename}: {e}") + + return total_chunks_saved + def process_html(document_id, user_id, temp_file_path, original_filename, enable_enhanced_citations, update_callback, group_id=None, public_workspace_id=None): """Processes HTML files.""" is_group = group_id is not None @@ -5228,6 +5629,88 @@ def update_doc_callback(**kwargs): # Don't fail if flag setting fails except Exception as log_error: + print(f"Error logging document creation transaction: {log_error}") + # Don't fail the entire process if logging fails + + # Create notification for document processing completion + try: + from functions_notifications import create_notification, create_group_notification, create_public_workspace_notification + + notification_title = f"Document ready: {original_filename}" + notification_message = f"Your document has been processed successfully with {total_chunks_saved} chunks." + + # Determine workspace type and create appropriate notification + if public_workspace_id: + # Notification for all public workspace members + create_public_workspace_notification( + public_workspace_id=public_workspace_id, + notification_type='document_processing_complete', + title=notification_title, + message=notification_message, + link_url='/public_directory', + link_context={ + 'workspace_type': 'public', + 'public_workspace_id': public_workspace_id, + 'document_id': document_id + }, + metadata={ + 'document_id': document_id, + 'file_name': original_filename, + 'chunks': total_chunks_saved + } + ) + print(f"📢 Created notification for public workspace {public_workspace_id}") + + elif group_id: + # Notification for all group members - get group name + from functions_group import find_group_by_id + group = find_group_by_id(group_id) + group_name = group.get('name', 'Unknown Group') if group else 'Unknown Group' + + create_group_notification( + group_id=group_id, + notification_type='document_processing_complete', + title=notification_title, + message=f"Document uploaded to {group_name} has been processed successfully with {total_chunks_saved} chunks.", + link_url='/group_workspaces', + link_context={ + 'workspace_type': 'group', + 'group_id': group_id, + 'document_id': document_id + }, + metadata={ + 'document_id': document_id, + 'file_name': original_filename, + 'chunks': total_chunks_saved, + 'group_name': group_name, + 'group_id': group_id + } + ) + print(f"📢 Created notification for group {group_id} ({group_name})") + + else: + # Personal notification for the uploader + create_notification( + user_id=user_id, + notification_type='document_processing_complete', + title=notification_title, + message=notification_message, + link_url='/workspace', + link_context={ + 'workspace_type': 'personal', + 'document_id': document_id + }, + metadata={ + 'document_id': document_id, + 'file_name': original_filename, + 'chunks': total_chunks_saved + } + ) + print(f"📢 Created notification for user {user_id}") + + except Exception as notif_error: + print(f"⚠️ Warning: Failed to create notification: {notif_error}") + # Don't fail the entire process if notification creation fails print(f"⚠️ Warning: Failed to log document creation 
transaction: {log_error}") # Don't fail the document processing if logging fails diff --git a/application/single_app/functions_group.py b/application/single_app/functions_group.py index 195e268e..dcf63420 100644 --- a/application/single_app/functions_group.py +++ b/application/single_app/functions_group.py @@ -189,4 +189,85 @@ def is_user_in_group(group_doc, user_id): for u in group_doc.get("users", []): if u["userId"] == user_id: return True - return False \ No newline at end of file + return False + + +def check_group_status_allows_operation(group_doc, operation_type): + """ + Check if the group's status allows the specified operation. + + Args: + group_doc: The group document from Cosmos DB + operation_type: One of 'upload', 'delete', 'chat', 'view' + + Returns: + tuple: (allowed: bool, reason: str) + + Status definitions: + - active: All operations allowed + - locked: Read-only mode (view and chat only, no modifications) + - upload_disabled: No new uploads, but deletions and chat allowed + - inactive: No operations allowed except admin viewing + """ + if not group_doc: + return False, "Group not found" + + status = group_doc.get('status', 'active') # Default to 'active' if not set + + # Define what each status allows + status_permissions = { + 'active': { + 'upload': True, + 'delete': True, + 'chat': True, + 'view': True + }, + 'locked': { + 'upload': False, + 'delete': False, + 'chat': True, + 'view': True + }, + 'upload_disabled': { + 'upload': False, + 'delete': True, + 'chat': True, + 'view': True + }, + 'inactive': { + 'upload': False, + 'delete': False, + 'chat': False, + 'view': False + } + } + + # Get permissions for current status + permissions = status_permissions.get(status, status_permissions['active']) + + # Check if operation is allowed + allowed = permissions.get(operation_type, False) + + # Generate helpful reason message if not allowed + if not allowed: + reasons = { + 'locked': { + 'upload': 'This group is locked (read-only mode). Document uploads are disabled.', + 'delete': 'This group is locked (read-only mode). Document deletions are disabled.' + }, + 'upload_disabled': { + 'upload': 'Document uploads are disabled for this group.' + }, + 'inactive': { + 'upload': 'This group is inactive. All operations are disabled.', + 'delete': 'This group is inactive. All operations are disabled.', + 'chat': 'This group is inactive. All operations are disabled.', + 'view': 'This group is inactive. Access is restricted to administrators.' 
+ } + } + + reason = reasons.get(status, {}).get(operation_type, + f'This operation is not allowed when group status is "{status}".') + return False, reason + + return True, "" \ No newline at end of file diff --git a/application/single_app/functions_group_actions.py b/application/single_app/functions_group_actions.py index 0dc0c3dd..bc6aa4ea 100644 --- a/application/single_app/functions_group_actions.py +++ b/application/single_app/functions_group_actions.py @@ -6,7 +6,7 @@ import uuid from datetime import datetime from typing import Any, Dict, List, Optional - +from functions_debug import debug_print from azure.cosmos import exceptions from flask import current_app @@ -42,7 +42,7 @@ def get_group_actions( except exceptions.CosmosResourceNotFoundError: return [] except Exception as exc: - current_app.logger.error( + debug_print( "Error fetching group actions for %s: %s", group_id, exc ) return [] @@ -74,7 +74,7 @@ def get_group_action( return None action = actions[0] except Exception as exc: - current_app.logger.error( + debug_print( "Error fetching group action %s for %s: %s", action_id, group_id, exc ) return None @@ -113,7 +113,7 @@ def save_group_action(group_id: str, action_data: Dict[str, Any]) -> Dict[str, A stored = cosmos_group_actions_container.upsert_item(body=payload) return _clean_action(stored, group_id, SecretReturnType.TRIGGER) except Exception as exc: - current_app.logger.error( + debug_print( "Error saving group action %s for %s: %s", action_id, group_id, exc ) raise @@ -137,7 +137,7 @@ def delete_group_action(group_id: str, action_id: str) -> bool: ) return True except Exception as exc: - current_app.logger.error( + debug_print( "Error deleting group action %s for %s: %s", action_id, group_id, exc ) raise diff --git a/application/single_app/functions_group_agents.py b/application/single_app/functions_group_agents.py index e8d34df4..76448098 100644 --- a/application/single_app/functions_group_agents.py +++ b/application/single_app/functions_group_agents.py @@ -6,7 +6,7 @@ import uuid from datetime import datetime from typing import Any, Dict, List, Optional - +from functions_debug import debug_print from azure.cosmos import exceptions from flask import current_app @@ -39,7 +39,7 @@ def get_group_agents(group_id: str) -> List[Dict[str, Any]]: except exceptions.CosmosResourceNotFoundError: return [] except Exception as exc: - current_app.logger.error( + debug_print( "Error fetching group agents for %s: %s", group_id, exc ) return [] @@ -56,7 +56,7 @@ def get_group_agent(group_id: str, agent_id: str) -> Optional[Dict[str, Any]]: except exceptions.CosmosResourceNotFoundError: return None except Exception as exc: - current_app.logger.error( + debug_print( "Error fetching group agent %s for %s: %s", agent_id, group_id, exc ) return None @@ -111,7 +111,7 @@ def save_group_agent(group_id: str, agent_data: Dict[str, Any]) -> Dict[str, Any stored = cosmos_group_agents_container.upsert_item(body=payload) return _clean_agent(stored) except Exception as exc: - current_app.logger.error( + debug_print( "Error saving group agent %s for %s: %s", agent_id, group_id, exc ) raise @@ -135,7 +135,7 @@ def delete_group_agent(group_id: str, agent_id: str) -> bool: ) return True except Exception as exc: - current_app.logger.error( + debug_print( "Error deleting group agent %s for %s: %s", agent_id, group_id, exc ) raise diff --git a/application/single_app/functions_notifications.py b/application/single_app/functions_notifications.py new file mode 100644 index 00000000..15ce11e4 --- /dev/null +++ 
b/application/single_app/functions_notifications.py @@ -0,0 +1,579 @@ +# functions_notifications.py + +""" +Notifications Management + +This module handles all operations related to notifications stored in the +notifications container. Supports personal, group, and public workspace scoped +notifications with per-user read/dismiss tracking. + +Version: 0.234.032 +Implemented in: 0.234.032 +""" + +# Imports (grouped after docstring) +import uuid +from datetime import datetime, timezone +from azure.cosmos import exceptions +from flask import current_app +import logging +from config import cosmos_notifications_container +from functions_group import find_group_by_id +from functions_debug import debug_print +from functions_public_workspaces import find_public_workspace_by_id, get_user_public_workspaces + +# Constants +TTL_60_DAYS = 60 * 24 * 60 * 60 # 60 days in seconds (5184000) + +# Notification type registry for extensibility +NOTIFICATION_TYPES = { + 'document_processing_complete': { + 'icon': 'bi-file-earmark-check', + 'color': 'success' + }, + 'document_processing_failed': { + 'icon': 'bi-file-earmark-x', + 'color': 'danger' + }, + 'ownership_transfer_request': { + 'icon': 'bi-arrow-left-right', + 'color': 'warning' + }, + 'group_deletion_request': { + 'icon': 'bi-trash', + 'color': 'danger' + }, + 'document_deletion_request': { + 'icon': 'bi-trash', + 'color': 'warning' + }, + 'system_announcement': { + 'icon': 'bi-megaphone', + 'color': 'info' + } +} + + +def create_notification( + user_id=None, + group_id=None, + public_workspace_id=None, + notification_type='system_announcement', + title='', + message='', + link_url='', + link_context=None, + metadata=None, + assignment=None +): + """ + Create a notification for personal, group, or public workspace scope. + + Args: + user_id (str, optional): User ID for personal notifications (deprecated if using assignment) + group_id (str, optional): Group ID for group-scoped notifications + public_workspace_id (str, optional): Public workspace ID for workspace notifications + notification_type (str): Type of notification (must be in NOTIFICATION_TYPES) + title (str): Notification title + message (str): Notification message + link_url (str): URL to navigate to when clicked + link_context (dict, optional): Additional context for navigation + metadata (dict, optional): Flexible metadata for type-specific data + assignment (dict, optional): Role and ownership-based assignment: + { + 'roles': ['Admin', 'ControlCenterAdmin'], # Users with these roles see notification + 'personal_workspace_owner_id': 'user123', # Personal workspace owner + 'group_owner_id': 'user456', # Group owner + 'public_workspace_owner_id': 'user789' # Public workspace owner + } + If any role matches or any owner ID matches user's ID, notification is visible. 
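The assignment contract described above is easier to see in isolation. The sketch below is not part of this PR and the helper name `assignment_matches` is hypothetical, but the rule it encodes mirrors the filtering that `get_user_notifications` performs later in this file: any overlap with the listed roles, or a match on any of the owner IDs, makes an assignment-scoped notification visible to that user.

```python
# Illustrative sketch (not part of this PR) of how an 'assignment' spec is
# evaluated: a user sees the notification if any listed role matches, or if
# any of the owner IDs equals the user's ID.
def assignment_matches(assignment, user_id, user_roles=None):
    user_roles = user_roles or []
    if any(role in user_roles for role in assignment.get('roles', [])):
        return True
    return user_id in (
        assignment.get('personal_workspace_owner_id'),
        assignment.get('group_owner_id'),
        assignment.get('public_workspace_owner_id'),
    )

assignment = {
    'roles': ['Admin', 'ControlCenterAdmin'],
    'group_owner_id': 'user456',
}
print(assignment_matches(assignment, 'user123', ['Admin']))   # True  (role match)
print(assignment_matches(assignment, 'user456'))              # True  (owner match)
print(assignment_matches(assignment, 'user999', ['User']))    # False
```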
+ + Returns: + dict: Created notification document or None on error + """ + try: + # Determine scope and partition key + scope = 'personal' + partition_key = user_id + + # If assignment is provided, always use assignment partition for role-based notifications + if assignment: + # Assignment-based notifications always use the special assignment partition + # This allows role-based filtering across all users + scope = 'assignment' + partition_key = 'assignment-notifications' + else: + # Legacy behavior - partition by specific workspace + if group_id: + scope = 'group' + partition_key = group_id + elif public_workspace_id: + scope = 'public_workspace' + partition_key = public_workspace_id + + if not partition_key: + debug_print("create_notification: No partition key provided") + return None + + # Validate notification type + if notification_type not in NOTIFICATION_TYPES: + debug_print(f"Unknown notification type: {notification_type}") + + notification_doc = { + 'id': str(uuid.uuid4()), + 'user_id': user_id, + 'group_id': group_id, + 'public_workspace_id': public_workspace_id, + 'scope': scope, + 'notification_type': notification_type, + 'title': title, + 'message': message, + 'created_at': datetime.now(timezone.utc).isoformat(), + 'ttl': TTL_60_DAYS, + 'read_by': [], + 'dismissed_by': [], + 'link_url': link_url or '', + 'link_context': link_context or {}, + 'metadata': metadata or {}, + 'assignment': assignment or None + } + + # Create in Cosmos with partition key based on scope + cosmos_notifications_container.create_item(notification_doc) + + debug_print( + f"Notification created: {notification_doc['id']} " + f"[{scope}] [{notification_type}] for partition: {partition_key}" + ) + + return notification_doc + + except Exception as e: + debug_print(f"Error creating notification: {e}") + return None + + +def create_group_notification(group_id, notification_type, title, message, link_url='', link_context=None, metadata=None): + """ + Create a notification for all members of a group. + + Args: + group_id (str): Group ID + notification_type (str): Type of notification + title (str): Notification title + message (str): Notification message + link_url (str): URL to navigate to when clicked + link_context (dict, optional): Additional context for navigation + metadata (dict, optional): Additional metadata + + Returns: + dict: Created notification or None on error + """ + return create_notification( + group_id=group_id, + notification_type=notification_type, + title=title, + message=message, + link_url=link_url, + link_context=link_context or {'workspace_type': 'group', 'group_id': group_id}, + metadata=metadata + ) + + +def create_public_workspace_notification( + public_workspace_id, + notification_type, + title, + message, + link_url='', + link_context=None, + metadata=None +): + """ + Create a notification for all members of a public workspace. 
+ + Args: + public_workspace_id (str): Public workspace ID + notification_type (str): Type of notification + title (str): Notification title + message (str): Notification message + link_url (str): URL to navigate to when clicked + link_context (dict, optional): Additional context for navigation + metadata (dict, optional): Additional metadata + + Returns: + dict: Created notification or None on error + """ + return create_notification( + public_workspace_id=public_workspace_id, + notification_type=notification_type, + title=title, + message=message, + link_url=link_url, + link_context=link_context or { + 'workspace_type': 'public', + 'public_workspace_id': public_workspace_id + }, + metadata=metadata + ) + + +def get_user_notifications(user_id, page=1, per_page=20, include_read=True, include_dismissed=False, user_roles=None): + """ + Fetch notifications visible to a user from personal, group, and public workspace scopes. + Supports assignment-based notifications that target users by roles and/or ownership. + + Args: + user_id (str): User's unique identifier + page (int): Page number (1-indexed) + per_page (int): Items per page + include_read (bool): Include notifications already read by user + include_dismissed (bool): Include notifications dismissed by user + user_roles (list, optional): User's roles for assignment-based notifications + + Returns: + dict: { + 'notifications': [...], + 'total': int, + 'page': int, + 'per_page': int, + 'has_more': bool + } + """ + try: + all_notifications = [] + + # 1. Fetch personal notifications + personal_query = "SELECT * FROM c WHERE c.user_id = @user_id" + personal_params = [{"name": "@user_id", "value": user_id}] + + personal_notifications = list(cosmos_notifications_container.query_items( + query=personal_query, + parameters=personal_params, + partition_key=user_id + )) + all_notifications.extend(personal_notifications) + + # 2. Fetch group notifications for user's groups + from functions_group import get_user_groups + user_groups = get_user_groups(user_id) + + for group in user_groups: + group_id = group['id'] + group_query = "SELECT * FROM c WHERE c.group_id = @group_id" + group_params = [{"name": "@group_id", "value": group_id}] + + group_notifications = list(cosmos_notifications_container.query_items( + query=group_query, + parameters=group_params, + enable_cross_partition_query=True + )) + all_notifications.extend(group_notifications) + + # 3. Fetch public workspace notifications + from functions_public_workspaces import get_user_public_workspaces + user_workspaces = get_user_public_workspaces(user_id) + + for workspace in user_workspaces: + workspace_id = workspace['id'] + workspace_query = "SELECT * FROM c WHERE c.public_workspace_id = @workspace_id" + workspace_params = [{"name": "@workspace_id", "value": workspace_id}] + + workspace_notifications = list(cosmos_notifications_container.query_items( + query=workspace_query, + parameters=workspace_params, + enable_cross_partition_query=True + )) + all_notifications.extend(workspace_notifications) + + # 4. 
Fetch assignment-based notifications + assignment_query = "SELECT * FROM c WHERE c.scope = 'assignment'" + assignment_notifications = list(cosmos_notifications_container.query_items( + query=assignment_query, + enable_cross_partition_query=True + )) + + # Filter assignment notifications based on user's roles and ownership + for notif in assignment_notifications: + assignment = notif.get('assignment') + if not assignment: + continue + + # Check if user matches assignment criteria + user_matches = False + + # Check roles + if user_roles and assignment.get('roles'): + for role in assignment.get('roles', []): + if role in user_roles: + user_matches = True + break + + # Check ownership IDs + if not user_matches: + if assignment.get('personal_workspace_owner_id') == user_id: + user_matches = True + elif assignment.get('group_owner_id') == user_id: + user_matches = True + elif assignment.get('public_workspace_owner_id') == user_id: + user_matches = True + + if user_matches: + all_notifications.append(notif) + + # Filter based on read/dismissed status + filtered_notifications = [] + for notif in all_notifications: + notif_id = notif.get('id', 'unknown') + read_by = notif.get('read_by', []) + dismissed_by = notif.get('dismissed_by', []) + + if not include_dismissed and user_id in dismissed_by: + continue + if not include_read and user_id in read_by: + continue + + # Add UI metadata + notif['is_read'] = user_id in read_by + notif['is_dismissed'] = user_id in dismissed_by + notif['type_config'] = NOTIFICATION_TYPES.get( + notif.get('notification_type'), + NOTIFICATION_TYPES['system_announcement'] + ) + + filtered_notifications.append(notif) + + # Sort by created_at descending (newest first) + filtered_notifications.sort( + key=lambda x: x.get('created_at', ''), + reverse=True + ) + + # Pagination + total = len(filtered_notifications) + start_idx = (page - 1) * per_page + end_idx = start_idx + per_page + paginated = filtered_notifications[start_idx:end_idx] + + return { + 'notifications': paginated, + 'total': total, + 'page': page, + 'per_page': per_page, + 'has_more': end_idx < total + } + + except Exception as e: + debug_print(f"Error fetching notifications for user {user_id}: {e}") + return { + 'notifications': [], + 'total': 0, + 'page': page, + 'per_page': per_page, + 'has_more': False + } + + +def get_unread_notification_count(user_id): + """ + Get count of unread notifications for a user across all scopes. + + Args: + user_id (str): User's unique identifier + + Returns: + int: Count of unread notifications (capped at 10 for efficiency) + """ + try: + # Get notifications without pagination + result = get_user_notifications( + user_id=user_id, + page=1, + per_page=10, # Only need first 10 for badge display + include_read=False, + include_dismissed=False + ) + + return min(result['total'], 10) # Cap at 10 for display purposes + + except Exception as e: + debug_print(f"Error counting unread notifications for {user_id}: {e}") + return 0 + + +def mark_notification_read(notification_id, user_id): + """ + Mark a notification as read by a specific user. 
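Because read/dismiss filtering, newest-first sorting, and pagination all happen in memory after the per-scope queries, the behaviour can be reproduced without Cosmos. A small sketch with toy data; the `page_for_user` helper and the sample documents are illustrative, not part of the module:

```python
# Toy data standing in for documents returned from the notifications container.
notifications = [
    {'id': 'n1', 'created_at': '2025-01-01T10:00:00+00:00', 'read_by': ['u1'], 'dismissed_by': []},
    {'id': 'n2', 'created_at': '2025-01-03T10:00:00+00:00', 'read_by': [], 'dismissed_by': ['u1']},
    {'id': 'n3', 'created_at': '2025-01-02T10:00:00+00:00', 'read_by': [], 'dismissed_by': []},
]

def page_for_user(items, user_id, page=1, per_page=20, include_read=True, include_dismissed=False):
    visible = []
    for n in items:
        if not include_dismissed and user_id in n.get('dismissed_by', []):
            continue
        if not include_read and user_id in n.get('read_by', []):
            continue
        visible.append(dict(n, is_read=user_id in n.get('read_by', [])))
    visible.sort(key=lambda n: n.get('created_at', ''), reverse=True)  # newest first
    start = (page - 1) * per_page
    return {'notifications': visible[start:start + per_page],
            'total': len(visible),
            'has_more': start + per_page < len(visible)}

result = page_for_user(notifications, 'u1', include_read=False)
print([n['id'] for n in result['notifications']], result['total'])  # ['n3'] 1
```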
+ + Args: + notification_id (str): Notification ID + user_id (str): User ID marking as read + + Returns: + bool: True if successful, False otherwise + """ + try: + # First, find the notification across all partition keys + query = "SELECT * FROM c WHERE c.id = @notification_id" + params = [{"name": "@notification_id", "value": notification_id}] + + notifications = list(cosmos_notifications_container.query_items( + query=query, + parameters=params, + enable_cross_partition_query=True + )) + + if not notifications: + debug_print(f"Notification {notification_id} not found") + return False + + notification = notifications[0] + + # Determine partition key + partition_key = notification.get('user_id') or notification.get('group_id') or notification.get('public_workspace_id') + + if not partition_key: + debug_print(f"No partition key found for notification {notification_id}") + return False + + # Add user to read_by if not already present + read_by = notification.get('read_by', []) + if user_id not in read_by: + read_by.append(user_id) + notification['read_by'] = read_by + + cosmos_notifications_container.upsert_item(notification) + debug_print(f"Notification {notification_id} marked read by {user_id}") + + return True + + except Exception as e: + debug_print(f"Error marking notification {notification_id} as read: {e}") + return False + + +def dismiss_notification(notification_id, user_id): + """ + Dismiss a notification for a specific user (adds to dismissed_by). + + Args: + notification_id (str): Notification ID + user_id (str): User ID dismissing the notification + + Returns: + bool: True if successful, False otherwise + """ + try: + # Find notification across all partitions + query = "SELECT * FROM c WHERE c.id = @notification_id" + params = [{"name": "@notification_id", "value": notification_id}] + + notifications = list(cosmos_notifications_container.query_items( + query=query, + parameters=params, + enable_cross_partition_query=True + )) + + if not notifications: + debug_print(f"Notification {notification_id} not found") + return False + + notification = notifications[0] + + # Determine partition key + partition_key = notification.get('user_id') or notification.get('group_id') or notification.get('public_workspace_id') + + if not partition_key: + debug_print(f"No partition key found for notification {notification_id}") + return False + + # Add user to dismissed_by + dismissed_by = notification.get('dismissed_by', []) + if user_id not in dismissed_by: + dismissed_by.append(user_id) + notification['dismissed_by'] = dismissed_by + + cosmos_notifications_container.upsert_item(notification) + debug_print(f"Notification {notification_id} dismissed by {user_id}") + + return True + + except Exception as e: + debug_print(f"Error dismissing notification {notification_id}: {e}") + return False + + +def mark_all_read(user_id): + """ + Mark all unread notifications as read for a user. 
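One design point worth noting in `mark_notification_read` and `dismiss_notification` is that read and dismiss state is tracked per user on the shared document, so one group member dismissing a notification does not hide it from the rest of the group. A toy model of that behaviour, with no Cosmos involved; the helper names are illustrative:

```python
# Toy stand-in for a group-scoped notification document.
notification = {'id': 'n1', 'group_id': 'g1', 'read_by': [], 'dismissed_by': []}

def mark_read(notif, user_id):
    if user_id not in notif['read_by']:        # idempotent, like the real helper
        notif['read_by'].append(user_id)

def dismiss(notif, user_id):
    if user_id not in notif['dismissed_by']:
        notif['dismissed_by'].append(user_id)

mark_read(notification, 'alice')
dismiss(notification, 'alice')
mark_read(notification, 'alice')               # no duplicate entry is added

print(notification['read_by'])                 # ['alice']
print('bob' in notification['dismissed_by'])   # False -- still visible to bob
```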
+ + Args: + user_id (str): User's unique identifier + + Returns: + int: Number of notifications marked as read + """ + try: + # Get all unread notifications + result = get_user_notifications( + user_id=user_id, + page=1, + per_page=1000, # Get all unread + include_read=False, + include_dismissed=True + ) + + count = 0 + for notification in result['notifications']: + if mark_notification_read(notification['id'], user_id): + count += 1 + + debug_print(f"Marked {count} notifications as read for user {user_id}") + return count + + except Exception as e: + debug_print(f"Error marking all notifications as read for {user_id}: {e}") + return 0 + + +def delete_notification(notification_id): + """ + Permanently delete a notification (admin only). + + Args: + notification_id (str): Notification ID to delete + + Returns: + bool: True if successful, False otherwise + """ + try: + # Find notification to get partition key + query = "SELECT * FROM c WHERE c.id = @notification_id" + params = [{"name": "@notification_id", "value": notification_id}] + + notifications = list(cosmos_notifications_container.query_items( + query=query, + parameters=params, + enable_cross_partition_query=True + )) + + if not notifications: + return False + + notification = notifications[0] + partition_key = notification.get('user_id') or notification.get('group_id') or notification.get('public_workspace_id') + + if not partition_key: + return False + + cosmos_notifications_container.delete_item( + item=notification_id, + partition_key=partition_key + ) + + debug_print(f"Notification {notification_id} permanently deleted") + return True + + except Exception as e: + debug_print(f"Error deleting notification {notification_id}: {e}") + return False diff --git a/application/single_app/functions_personal_actions.py b/application/single_app/functions_personal_actions.py index 108d3151..6345438e 100644 --- a/application/single_app/functions_personal_actions.py +++ b/application/single_app/functions_personal_actions.py @@ -13,6 +13,7 @@ from flask import current_app from functions_keyvault import keyvault_plugin_save_helper, keyvault_plugin_get_helper, keyvault_plugin_delete_helper, SecretReturnType from functions_settings import get_user_settings, update_user_settings +from functions_debug import debug_print from config import cosmos_personal_actions_container import logging @@ -47,7 +48,7 @@ def get_personal_actions(user_id, return_type=SecretReturnType.TRIGGER): except exceptions.CosmosResourceNotFoundError: return [] except Exception as e: - current_app.logger.error(f"Error fetching personal actions for user {user_id}: {e}") + debug_print(f"Error fetching personal actions for user {user_id}: {e}") return [] def get_personal_action(user_id, action_id, return_type=SecretReturnType.TRIGGER): @@ -91,7 +92,7 @@ def get_personal_action(user_id, action_id, return_type=SecretReturnType.TRIGGER return cleaned_action except Exception as e: - current_app.logger.error(f"Error fetching action {action_id} for user {user_id}: {e}") + debug_print(f"Error fetching action {action_id} for user {user_id}: {e}") return None def save_personal_action(user_id, action_data): @@ -151,7 +152,7 @@ def save_personal_action(user_id, action_data): return cleaned_result except Exception as e: - current_app.logger.error(f"Error saving action for user {user_id}: {e}") + debug_print(f"Error saving action for user {user_id}: {e}") raise def delete_personal_action(user_id, action_id): @@ -182,7 +183,7 @@ def delete_personal_action(user_id, action_id): except 
exceptions.CosmosResourceNotFoundError: return False except Exception as e: - current_app.logger.error(f"Error deleting action {action_id} for user {user_id}: {e}") + debug_print(f"Error deleting action {action_id} for user {user_id}: {e}") raise def ensure_migration_complete(user_id): @@ -213,13 +214,13 @@ def ensure_migration_complete(user_id): settings_to_update = user_settings.get('settings', {}) settings_to_update['plugins'] = [] # Set to empty array instead of removing update_user_settings(user_id, settings_to_update) - current_app.logger.info(f"Cleaned up legacy plugin data for user {user_id} (already migrated)") + debug_print(f"Cleaned up legacy plugin data for user {user_id} (already migrated)") return 0 return 0 except Exception as e: - current_app.logger.error(f"Error ensuring action migration complete for user {user_id}: {e}") + debug_print(f"Error ensuring action migration complete for user {user_id}: {e}") return 0 def migrate_actions_from_user_settings(user_id): @@ -245,7 +246,7 @@ def migrate_actions_from_user_settings(user_id): try: # Skip if plugin already exists in personal container if plugin.get('name') in existing_action_names: - current_app.logger.info(f"Skipping migration of plugin '{plugin.get('name')}' - already exists") + debug_print(f"Skipping migration of plugin '{plugin.get('name')}' - already exists") continue # Ensure plugin has an ID (generate GUID if missing) if 'id' not in plugin or not plugin['id']: @@ -255,18 +256,18 @@ def migrate_actions_from_user_settings(user_id): save_personal_action(user_id, plugin) migrated_count += 1 except Exception as e: - current_app.logger.error(f"Error migrating plugin {plugin.get('name', 'unknown')} for user {user_id}: {e}") + debug_print(f"Error migrating plugin {plugin.get('name', 'unknown')} for user {user_id}: {e}") # Always remove plugins from user settings after processing (even if no new ones migrated) settings_to_update = user_settings.get('settings', {}) settings_to_update['plugins'] = [] # Set to empty array instead of removing update_user_settings(user_id, settings_to_update) - current_app.logger.info(f"Migrated {migrated_count} new actions for user {user_id}, cleaned up legacy data") + debug_print(f"Migrated {migrated_count} new actions for user {user_id}, cleaned up legacy data") return migrated_count except Exception as e: - current_app.logger.error(f"Error during action migration for user {user_id}: {e}") + debug_print(f"Error during action migration for user {user_id}: {e}") return 0 def get_actions_by_names(user_id, action_names, return_type=SecretReturnType.TRIGGER): @@ -308,7 +309,7 @@ def get_actions_by_names(user_id, action_names, return_type=SecretReturnType.TRI return cleaned_actions except Exception as e: - current_app.logger.error(f"Error fetching actions by names for user {user_id}: {e}") + debug_print(f"Error fetching actions by names for user {user_id}: {e}") return [] def get_actions_by_type(user_id, action_type, return_type=SecretReturnType.TRIGGER): @@ -345,5 +346,5 @@ def get_actions_by_type(user_id, action_type, return_type=SecretReturnType.TRIGG return cleaned_actions except Exception as e: - current_app.logger.error(f"Error fetching actions by type {action_type} for user {user_id}: {e}") + debug_print(f"Error fetching actions by type {action_type} for user {user_id}: {e}") return [] diff --git a/application/single_app/functions_personal_agents.py b/application/single_app/functions_personal_agents.py index 3f2cc6ea..7462d1b4 100644 --- a/application/single_app/functions_personal_agents.py 
+++ b/application/single_app/functions_personal_agents.py @@ -18,6 +18,7 @@ from config import cosmos_personal_agents_container from functions_settings import get_settings, get_user_settings, update_user_settings from functions_keyvault import keyvault_agent_save_helper, keyvault_agent_get_helper, keyvault_agent_delete_helper +from functions_debug import debug_print def get_personal_agents(user_id): """ @@ -58,7 +59,7 @@ def get_personal_agents(user_id): except exceptions.CosmosResourceNotFoundError: return [] except Exception as e: - current_app.logger.error(f"Error fetching personal agents for user {user_id}: {e}") + debug_print(f"Error fetching personal agents for user {user_id}: {e}") return [] def get_personal_agent(user_id, agent_id): @@ -92,10 +93,10 @@ def get_personal_agent(user_id, agent_id): cleaned_agent.pop('reasoning_effort', None) return cleaned_agent except exceptions.CosmosResourceNotFoundError: - current_app.logger.warning(f"Agent {agent_id} not found for user {user_id}") + debug_print(f"Agent {agent_id} not found for user {user_id}") return None except Exception as e: - current_app.logger.error(f"Error fetching agent {agent_id} for user {user_id}: {e}") + debug_print(f"Error fetching agent {agent_id} for user {user_id}: {e}") return None def save_personal_agent(user_id, agent_data): @@ -153,7 +154,7 @@ def save_personal_agent(user_id, agent_data): return cleaned_result except Exception as e: - current_app.logger.error(f"Error saving agent for user {user_id}: {e}") + debug_print(f"Error saving agent for user {user_id}: {e}") raise def delete_personal_agent(user_id, agent_id): @@ -185,10 +186,10 @@ def delete_personal_agent(user_id, agent_id): ) return True except exceptions.CosmosResourceNotFoundError: - current_app.logger.warning(f"Agent {agent_id} not found for user {user_id}") + debug_print(f"Agent {agent_id} not found for user {user_id}") return False except Exception as e: - current_app.logger.error(f"Error deleting agent {agent_id} for user {user_id}: {e}") + debug_print(f"Error deleting agent {agent_id} for user {user_id}: {e}") raise def ensure_migration_complete(user_id): @@ -219,13 +220,13 @@ def ensure_migration_complete(user_id): settings_to_update = user_settings.get('settings', {}) settings_to_update['agents'] = [] # Set to empty array instead of removing update_user_settings(user_id, settings_to_update) - current_app.logger.info(f"Cleaned up legacy agent data for user {user_id} (already migrated)") + debug_print(f"Cleaned up legacy agent data for user {user_id} (already migrated)") return 0 return 0 except Exception as e: - current_app.logger.error(f"Error ensuring agent migration complete for user {user_id}: {e}") + debug_print(f"Error ensuring agent migration complete for user {user_id}: {e}") return 0 def migrate_agents_from_user_settings(user_id): @@ -249,7 +250,7 @@ def migrate_agents_from_user_settings(user_id): try: # Skip if agent already exists in personal container if agent.get('name') in existing_agent_names: - current_app.logger.info(f"Skipping migration of agent '{agent.get('name')}' - already exists") + debug_print(f"Skipping migration of agent '{agent.get('name')}' - already exists") continue # Ensure agent has an ID if 'id' not in agent: @@ -257,14 +258,14 @@ def migrate_agents_from_user_settings(user_id): save_personal_agent(user_id, agent) migrated_count += 1 except Exception as e: - current_app.logger.error(f"Error migrating agent {agent.get('name', 'unknown')} for user {user_id}: {e}") + debug_print(f"Error migrating agent 
{agent.get('name', 'unknown')} for user {user_id}: {e}") # Always remove agents from user settings after processing (even if no new ones migrated) settings_to_update = user_settings.get('settings', {}) settings_to_update['agents'] = [] # Set to empty array instead of removing update_user_settings(user_id, settings_to_update) - current_app.logger.info(f"Migrated {migrated_count} new agents for user {user_id}, cleaned up legacy data") + debug_print(f"Migrated {migrated_count} new agents for user {user_id}, cleaned up legacy data") return migrated_count except Exception as e: - current_app.logger.error(f"Error during agent migration for user {user_id}: {e}") + debug_print(f"Error during agent migration for user {user_id}: {e}") return 0 diff --git a/application/single_app/functions_public_workspaces.py b/application/single_app/functions_public_workspaces.py index 53abf484..45e5f80e 100644 --- a/application/single_app/functions_public_workspaces.py +++ b/application/single_app/functions_public_workspaces.py @@ -333,4 +333,85 @@ def get_user_visible_public_workspace_docs(user_id: str) -> list: if ws["id"] in visible_workspace_ids ] - return visible_workspaces \ No newline at end of file + return visible_workspaces + + +def check_public_workspace_status_allows_operation(workspace_doc, operation_type): + """ + Check if the public workspace's status allows the specified operation. + + Args: + workspace_doc: The public workspace document from Cosmos DB + operation_type: One of 'upload', 'delete', 'chat', 'view' + + Returns: + tuple: (allowed: bool, reason: str) + + Status definitions: + - active: All operations allowed + - locked: Read-only mode (view and chat only, no modifications) + - upload_disabled: No new uploads, but deletions and chat allowed + - inactive: No operations allowed except admin viewing + """ + if not workspace_doc: + return False, "Public workspace not found" + + status = workspace_doc.get('status', 'active') # Default to 'active' if not set + + # Define what each status allows + status_permissions = { + 'active': { + 'upload': True, + 'delete': True, + 'chat': True, + 'view': True + }, + 'locked': { + 'upload': False, + 'delete': False, + 'chat': True, + 'view': True + }, + 'upload_disabled': { + 'upload': False, + 'delete': True, + 'chat': True, + 'view': True + }, + 'inactive': { + 'upload': False, + 'delete': False, + 'chat': False, + 'view': False + } + } + + # Get permissions for current status + permissions = status_permissions.get(status, status_permissions['active']) + + # Check if operation is allowed + allowed = permissions.get(operation_type, False) + + # Generate helpful reason message if not allowed + if not allowed: + reasons = { + 'locked': { + 'upload': 'This public workspace is locked (read-only mode). Document uploads are disabled.', + 'delete': 'This public workspace is locked (read-only mode). Document deletions are disabled.' + }, + 'upload_disabled': { + 'upload': 'Document uploads are disabled for this public workspace.' + }, + 'inactive': { + 'upload': 'This public workspace is inactive. All operations are disabled.', + 'delete': 'This public workspace is inactive. All operations are disabled.', + 'chat': 'This public workspace is inactive. All operations are disabled.', + 'view': 'This public workspace is inactive. Access is restricted to administrators.' 
+ } + } + + reason = reasons.get(status, {}).get(operation_type, + f'This operation is not allowed when public workspace status is "{status}".') + return False, reason + + return True, "" \ No newline at end of file diff --git a/application/single_app/functions_retention_policy.py b/application/single_app/functions_retention_policy.py new file mode 100644 index 00000000..02ba2eea --- /dev/null +++ b/application/single_app/functions_retention_policy.py @@ -0,0 +1,738 @@ +# functions_retention_policy.py + +""" +Retention Policy Management + +This module handles automated deletion of aged conversations and documents +based on configurable retention policies for personal, group, and public workspaces. + +Version: 0.234.067 +Implemented in: 0.234.067 +""" + +from config import * +from functions_settings import get_settings, update_settings, cosmos_user_settings_container +from functions_group import get_user_groups, cosmos_groups_container +from functions_public_workspaces import get_user_public_workspaces, cosmos_public_workspaces_container +from functions_documents import delete_document, delete_document_chunks +from functions_activity_logging import log_conversation_deletion, log_conversation_archival +from functions_notifications import create_notification, create_group_notification, create_public_workspace_notification +from functions_debug import debug_print +from datetime import datetime, timezone, timedelta + + +def get_all_user_settings(): + """ + Get all user settings from Cosmos DB. + + Returns: + list: List of all user setting documents + """ + try: + query = "SELECT * FROM c" + users = list(cosmos_user_settings_container.query_items( + query=query, + enable_cross_partition_query=True + )) + return users + except Exception as e: + debug_print(f"Error fetching all user settings: {e}") + return [] + + +def get_all_groups(): + """ + Get all groups from Cosmos DB. + + Returns: + list: List of all group documents + """ + try: + query = "SELECT * FROM c" + groups = list(cosmos_groups_container.query_items( + query=query, + enable_cross_partition_query=True + )) + return groups + except Exception as e: + debug_print(f"Error fetching all groups: {e}") + return [] + + +def get_all_public_workspaces(): + """ + Get all public workspaces from Cosmos DB. + + Returns: + list: List of all public workspace documents + """ + try: + query = "SELECT * FROM c" + workspaces = list(cosmos_public_workspaces_container.query_items( + query=query, + enable_cross_partition_query=True + )) + return workspaces + except Exception as e: + debug_print(f"Error fetching all public workspaces: {e}") + return [] + + +def execute_retention_policy(workspace_scopes=None, manual_execution=False): + """ + Execute retention policy for specified workspace scopes. + + Args: + workspace_scopes (list, optional): List of workspace types to process. + Can include 'personal', 'group', 'public'. If None, processes all enabled scopes. 
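The status helpers introduced above (`check_group_status_allows_operation` and its public-workspace counterpart) take a plain document dict plus an operation name, so their behaviour is easy to exercise directly. A usage sketch, assuming the function is imported from `functions_group` as elsewhere in the app; the sample group dict is illustrative:

```python
from functions_group import check_group_status_allows_operation

# A locked group permits read-style operations but rejects modifications.
locked_group = {'id': 'g1', 'name': 'Research', 'status': 'locked'}

allowed, reason = check_group_status_allows_operation(locked_group, 'upload')
print(allowed)   # False
print(reason)    # "This group is locked (read-only mode). Document uploads are disabled."

allowed, _ = check_group_status_allows_operation(locked_group, 'chat')
print(allowed)   # True -- chat and view remain available in locked mode
```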
+ manual_execution (bool): Whether this is a manual execution (bypasses schedule check) + + Returns: + dict: Summary of deletion results + """ + settings = get_settings() + + # Determine which scopes to process + if workspace_scopes is None: + workspace_scopes = [] + if settings.get('enable_retention_policy_personal', False): + workspace_scopes.append('personal') + if settings.get('enable_retention_policy_group', False): + workspace_scopes.append('group') + if settings.get('enable_retention_policy_public', False): + workspace_scopes.append('public') + + if not workspace_scopes: + debug_print("No retention policy scopes enabled") + return { + 'success': False, + 'message': 'No retention policy scopes enabled', + 'scopes_processed': [] + } + + results = { + 'success': True, + 'execution_time': datetime.now(timezone.utc).isoformat(), + 'manual_execution': manual_execution, + 'scopes_processed': workspace_scopes, + 'personal': {'conversations': 0, 'documents': 0, 'users_affected': 0}, + 'group': {'conversations': 0, 'documents': 0, 'workspaces_affected': 0}, + 'public': {'conversations': 0, 'documents': 0, 'workspaces_affected': 0}, + 'errors': [] + } + + try: + # Process personal workspaces + if 'personal' in workspace_scopes: + debug_print("Processing personal workspace retention policies...") + personal_results = process_personal_retention() + results['personal'] = personal_results + + # Process group workspaces + if 'group' in workspace_scopes: + debug_print("Processing group workspace retention policies...") + group_results = process_group_retention() + results['group'] = group_results + + # Process public workspaces + if 'public' in workspace_scopes: + debug_print("Processing public workspace retention policies...") + public_results = process_public_retention() + results['public'] = public_results + + # Update last run time in settings + settings['retention_policy_last_run'] = datetime.now(timezone.utc).isoformat() + + # Calculate next run time (scheduled for configured hour next day) + execution_hour = settings.get('retention_policy_execution_hour', 2) + next_run = datetime.now(timezone.utc).replace(hour=execution_hour, minute=0, second=0, microsecond=0) + if next_run <= datetime.now(timezone.utc): + next_run += timedelta(days=1) + settings['retention_policy_next_run'] = next_run.isoformat() + + update_settings(settings) + + debug_print(f"Retention policy execution completed: {results}") + return results + + except Exception as e: + debug_print(f"Error executing retention policy: {e}") + results['success'] = False + results['errors'].append(str(e)) + return results + + +def process_personal_retention(): + """ + Process retention policies for all personal workspaces. 
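The scheduling rule in `execute_retention_policy` (run at the configured UTC hour, rolling to the next day if that time has already passed) can be checked in isolation. A minimal sketch; `next_retention_run` is a hypothetical name, and `2` matches the `retention_policy_execution_hour` default used above:

```python
from datetime import datetime, timezone, timedelta

def next_retention_run(now=None, execution_hour=2):
    """Mirror of the scheduling rule: run at the configured UTC hour,
    rolling to the next day if that time has already passed."""
    now = now or datetime.now(timezone.utc)
    next_run = now.replace(hour=execution_hour, minute=0, second=0, microsecond=0)
    if next_run <= now:
        next_run += timedelta(days=1)
    return next_run

print(next_retention_run(datetime(2025, 1, 10, 1, 15, tzinfo=timezone.utc)))  # 2025-01-10 02:00:00+00:00
print(next_retention_run(datetime(2025, 1, 10, 14, 0, tzinfo=timezone.utc)))  # 2025-01-11 02:00:00+00:00
```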
+ + Returns: + dict: Deletion statistics + """ + results = { + 'conversations': 0, + 'documents': 0, + 'users_affected': 0, + 'details': [] + } + + try: + # Get all user settings + all_users = get_all_user_settings() + + for user in all_users: + user_id = user.get('id') + if not user_id: + continue + + # Get user's retention settings + user_settings = user.get('settings', {}) + retention_settings = user_settings.get('retention_policy', {}) + + conversation_retention_days = retention_settings.get('conversation_retention_days', 'none') + document_retention_days = retention_settings.get('document_retention_days', 'none') + + # Skip if both are set to "none" + if conversation_retention_days == 'none' and document_retention_days == 'none': + continue + + user_deletion_summary = { + 'user_id': user_id, + 'conversations_deleted': 0, + 'documents_deleted': 0, + 'conversation_details': [], + 'document_details': [] + } + + # Process conversations + if conversation_retention_days != 'none': + try: + conv_results = delete_aged_conversations( + user_id=user_id, + retention_days=int(conversation_retention_days), + workspace_type='personal' + ) + user_deletion_summary['conversations_deleted'] = conv_results['count'] + user_deletion_summary['conversation_details'] = conv_results['details'] + results['conversations'] += conv_results['count'] + except Exception as e: + debug_print(f"Error processing conversations for user {user_id}: {e}") + + # Process documents + if document_retention_days != 'none': + try: + doc_results = delete_aged_documents( + user_id=user_id, + retention_days=int(document_retention_days), + workspace_type='personal' + ) + user_deletion_summary['documents_deleted'] = doc_results['count'] + user_deletion_summary['document_details'] = doc_results['details'] + results['documents'] += doc_results['count'] + except Exception as e: + debug_print(f"Error processing documents for user {user_id}: {e}") + + # Send notification if anything was deleted + if user_deletion_summary['conversations_deleted'] > 0 or user_deletion_summary['documents_deleted'] > 0: + send_retention_notification(user_id, user_deletion_summary, 'personal') + results['users_affected'] += 1 + results['details'].append(user_deletion_summary) + + return results + + except Exception as e: + debug_print(f"Error in process_personal_retention: {e}") + return results + + +def process_group_retention(): + """ + Process retention policies for all group workspaces. 
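Each workspace's retention settings store `conversation_retention_days` and `document_retention_days` either as the string `'none'` (retention disabled) or as a day count that the code converts with `int()`. A small sketch of that convention; `parse_retention_days` is an illustrative helper, not part of the module:

```python
def parse_retention_days(value):
    """Hypothetical helper: the setting is either the string 'none'
    (retention disabled) or a day count stored as a string or number."""
    if value in (None, 'none'):
        return None
    return int(value)

retention_policy = {'conversation_retention_days': '90', 'document_retention_days': 'none'}
for key, raw in retention_policy.items():
    days = parse_retention_days(raw)
    print(key, '->', 'disabled' if days is None else f'{days} days')
# conversation_retention_days -> 90 days
# document_retention_days -> disabled
```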
+ + Returns: + dict: Deletion statistics + """ + results = { + 'conversations': 0, + 'documents': 0, + 'workspaces_affected': 0, + 'details': [] + } + + try: + # Get all groups + all_groups = get_all_groups() + + for group in all_groups: + group_id = group.get('id') + if not group_id: + continue + + # Get group's retention settings + retention_settings = group.get('retention_policy', {}) + + conversation_retention_days = retention_settings.get('conversation_retention_days', 'none') + document_retention_days = retention_settings.get('document_retention_days', 'none') + + # Skip if both are set to "none" + if conversation_retention_days == 'none' and document_retention_days == 'none': + continue + + group_deletion_summary = { + 'group_id': group_id, + 'group_name': group.get('name', 'Unnamed Group'), + 'conversations_deleted': 0, + 'documents_deleted': 0, + 'conversation_details': [], + 'document_details': [] + } + + # Process conversations + if conversation_retention_days != 'none': + try: + conv_results = delete_aged_conversations( + group_id=group_id, + retention_days=int(conversation_retention_days), + workspace_type='group' + ) + group_deletion_summary['conversations_deleted'] = conv_results['count'] + group_deletion_summary['conversation_details'] = conv_results['details'] + results['conversations'] += conv_results['count'] + except Exception as e: + debug_print(f"Error processing conversations for group {group_id}: {e}") + + # Process documents + if document_retention_days != 'none': + try: + doc_results = delete_aged_documents( + group_id=group_id, + retention_days=int(document_retention_days), + workspace_type='group' + ) + group_deletion_summary['documents_deleted'] = doc_results['count'] + group_deletion_summary['document_details'] = doc_results['details'] + results['documents'] += doc_results['count'] + except Exception as e: + debug_print(f"Error processing documents for group {group_id}: {e}") + + # Send notification if anything was deleted + if group_deletion_summary['conversations_deleted'] > 0 or group_deletion_summary['documents_deleted'] > 0: + send_retention_notification(group_id, group_deletion_summary, 'group') + results['workspaces_affected'] += 1 + results['details'].append(group_deletion_summary) + + return results + + except Exception as e: + debug_print(f"Error in process_group_retention: {e}") + return results + + +def process_public_retention(): + """ + Process retention policies for all public workspaces. 
+ + Returns: + dict: Deletion statistics + """ + results = { + 'conversations': 0, + 'documents': 0, + 'workspaces_affected': 0, + 'details': [] + } + + try: + # Get all public workspaces + all_workspaces = get_all_public_workspaces() + + for workspace in all_workspaces: + workspace_id = workspace.get('id') + if not workspace_id: + continue + + # Get workspace's retention settings + retention_settings = workspace.get('retention_policy', {}) + + conversation_retention_days = retention_settings.get('conversation_retention_days', 'none') + document_retention_days = retention_settings.get('document_retention_days', 'none') + + # Skip if both are set to "none" + if conversation_retention_days == 'none' and document_retention_days == 'none': + continue + + workspace_deletion_summary = { + 'public_workspace_id': workspace_id, + 'workspace_name': workspace.get('name', 'Unnamed Workspace'), + 'conversations_deleted': 0, + 'documents_deleted': 0, + 'conversation_details': [], + 'document_details': [] + } + + # Process conversations + if conversation_retention_days != 'none': + try: + conv_results = delete_aged_conversations( + public_workspace_id=workspace_id, + retention_days=int(conversation_retention_days), + workspace_type='public' + ) + workspace_deletion_summary['conversations_deleted'] = conv_results['count'] + workspace_deletion_summary['conversation_details'] = conv_results['details'] + results['conversations'] += conv_results['count'] + except Exception as e: + debug_print(f"Error processing conversations for public workspace {workspace_id}: {e}") + + # Process documents + if document_retention_days != 'none': + try: + doc_results = delete_aged_documents( + public_workspace_id=workspace_id, + retention_days=int(document_retention_days), + workspace_type='public' + ) + workspace_deletion_summary['documents_deleted'] = doc_results['count'] + workspace_deletion_summary['document_details'] = doc_results['details'] + results['documents'] += doc_results['count'] + except Exception as e: + debug_print(f"Error processing documents for public workspace {workspace_id}: {e}") + + # Send notification if anything was deleted + if workspace_deletion_summary['conversations_deleted'] > 0 or workspace_deletion_summary['documents_deleted'] > 0: + send_retention_notification(workspace_id, workspace_deletion_summary, 'public') + results['workspaces_affected'] += 1 + results['details'].append(workspace_deletion_summary) + + return results + + except Exception as e: + debug_print(f"Error in process_public_retention: {e}") + return results + + +def delete_aged_conversations(retention_days, workspace_type='personal', user_id=None, group_id=None, public_workspace_id=None): + """ + Delete conversations that exceed the retention period based on last_activity_at. 
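+    Illustrative call (personal workspace):
+        delete_aged_conversations(retention_days=30,
+                                  workspace_type='personal',
+                                  user_id='<user-id>')
+    Conversations whose last_activity_at is older than 30 days (or missing) are
+    archived first when conversation archiving is enabled, then deleted along
+    with their messages, and the call returns {'count': ..., 'details': [...]}.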
+ + Args: + retention_days (int): Number of days to retain conversations + workspace_type (str): 'personal', 'group', or 'public' + user_id (str, optional): User ID for personal workspaces + group_id (str, optional): Group ID for group workspaces + public_workspace_id (str, optional): Public workspace ID for public workspaces + + Returns: + dict: {'count': int, 'details': list} + """ + settings = get_settings() + archiving_enabled = settings.get('enable_conversation_archiving', False) + + # Determine which container to use + if workspace_type == 'group': + container = cosmos_group_conversations_container + partition_field = 'group_id' + partition_value = group_id + elif workspace_type == 'public': + container = cosmos_public_conversations_container + partition_field = 'public_workspace_id' + partition_value = public_workspace_id + else: + container = cosmos_conversations_container + partition_field = 'user_id' + partition_value = user_id + + # Calculate cutoff date + cutoff_date = datetime.now(timezone.utc) - timedelta(days=retention_days) + cutoff_iso = cutoff_date.isoformat() + + # Query for aged conversations + query = f""" + SELECT c.id, c.title, c.last_activity_at, c.{partition_field} + FROM c + WHERE c.{partition_field} = @partition_value + AND (c.last_activity_at < @cutoff_date OR IS_NULL(c.last_activity_at)) + """ + + parameters = [ + {"name": "@partition_value", "value": partition_value}, + {"name": "@cutoff_date", "value": cutoff_iso} + ] + + aged_conversations = list(container.query_items( + query=query, + parameters=parameters, + enable_cross_partition_query=True + )) + + deleted_details = [] + + for conv in aged_conversations: + conversation_id = conv.get('id') + conversation_title = conv.get('title', 'Untitled') + + try: + # Read full conversation for archiving/logging + conversation_item = container.read_item( + item=conversation_id, + partition_key=conversation_id + ) + + # Archive if enabled + if archiving_enabled: + archived_item = dict(conversation_item) + archived_item["archived_at"] = datetime.now(timezone.utc).isoformat() + archived_item["archived_by_retention_policy"] = True + cosmos_archived_conversations_container.upsert_item(archived_item) + + log_conversation_archival( + user_id=conversation_item.get('user_id'), + conversation_id=conversation_id, + title=conversation_title, + workspace_type=workspace_type, + context=conversation_item.get('context', []), + tags=conversation_item.get('tags', []), + group_id=conversation_item.get('group_id'), + public_workspace_id=conversation_item.get('public_workspace_id') + ) + + # Delete messages + + if workspace_type == 'group': + messages_container = cosmos_group_messages_container + elif workspace_type == 'public': + messages_container = cosmos_public_messages_container + else: + messages_container = cosmos_messages_container + + message_query = f"SELECT * FROM c WHERE c.conversation_id = @conversation_id" + message_params = [{"name": "@conversation_id", "value": conversation_id}] + + messages = list(messages_container.query_items( + query=message_query, + parameters=message_params, + partition_key=conversation_id + )) + + for msg in messages: + if archiving_enabled: + archived_msg = dict(msg) + archived_msg["archived_at"] = datetime.now(timezone.utc).isoformat() + archived_msg["archived_by_retention_policy"] = True + cosmos_archived_messages_container.upsert_item(archived_msg) + + messages_container.delete_item(msg['id'], partition_key=conversation_id) + + # Log deletion + log_conversation_deletion( + 
user_id=conversation_item.get('user_id'), + conversation_id=conversation_id, + title=conversation_title, + workspace_type=workspace_type, + context=conversation_item.get('context', []), + tags=conversation_item.get('tags', []), + is_archived=archiving_enabled, + is_bulk_operation=True, + group_id=conversation_item.get('group_id'), + public_workspace_id=conversation_item.get('public_workspace_id'), + deletion_reason='retention_policy' + ) + + # Delete conversation + container.delete_item( + item=conversation_id, + partition_key=conversation_id + ) + + deleted_details.append({ + 'id': conversation_id, + 'title': conversation_title, + 'last_activity_at': conv.get('last_activity_at') + }) + + debug_print(f"Deleted conversation {conversation_id} ({conversation_title}) due to retention policy") + + except Exception as e: + debug_print(f"Error deleting conversation {conversation_id}: {e}") + + return { + 'count': len(deleted_details), + 'details': deleted_details + } + + +def delete_aged_documents(retention_days, workspace_type='personal', user_id=None, group_id=None, public_workspace_id=None): + """ + Delete documents that exceed the retention period based on last_activity_at. + + Args: + retention_days (int): Number of days to retain documents + workspace_type (str): 'personal', 'group', or 'public' + user_id (str, optional): User ID for personal workspaces + group_id (str, optional): Group ID for group workspaces + public_workspace_id (str, optional): Public workspace ID for public workspaces + + Returns: + dict: {'count': int, 'details': list} + """ + # Determine which container to use + if workspace_type == 'group': + container = cosmos_group_documents_container + partition_field = 'group_id' + partition_value = group_id + deletion_user_id = None # Will be extracted from document + elif workspace_type == 'public': + container = cosmos_public_documents_container + partition_field = 'public_workspace_id' + partition_value = public_workspace_id + deletion_user_id = None # Will be extracted from document + else: + container = cosmos_user_documents_container + partition_field = 'user_id' + partition_value = user_id + deletion_user_id = user_id + + # Calculate cutoff date + cutoff_date = datetime.now(timezone.utc) - timedelta(days=retention_days) + cutoff_iso = cutoff_date.isoformat() + + # Query for aged documents + query = f""" + SELECT c.id, c.file_name, c.title, c.last_activity_at, c.{partition_field}, c.user_id + FROM c + WHERE c.{partition_field} = @partition_value + AND (c.last_activity_at < @cutoff_date OR IS_NULL(c.last_activity_at)) + """ + + parameters = [ + {"name": "@partition_value", "value": partition_value}, + {"name": "@cutoff_date", "value": cutoff_iso} + ] + + aged_documents = list(container.query_items( + query=query, + parameters=parameters, + enable_cross_partition_query=True + )) + + deleted_details = [] + + for doc in aged_documents: + document_id = doc.get('id') + file_name = doc.get('file_name', 'Unknown') + title = doc.get('title', file_name) + doc_user_id = doc.get('user_id') or deletion_user_id + + try: + # Delete document chunks from search index + delete_document_chunks(document_id, group_id, public_workspace_id) + + # Delete document from Cosmos DB and blob storage + delete_document(doc_user_id, document_id, group_id, public_workspace_id) + + deleted_details.append({ + 'id': document_id, + 'file_name': file_name, + 'title': title, + 'last_activity_at': doc.get('last_activity_at') + }) + + debug_print(f"Deleted document {document_id} ({file_name}) due to retention 
policy") + + except Exception as e: + debug_print(f"Error deleting document {document_id}: {e}") + + return { + 'count': len(deleted_details), + 'details': deleted_details + } + + +def send_retention_notification(workspace_id, deletion_summary, workspace_type): + """ + Send notification about retention policy deletions. + + Args: + workspace_id (str): User ID, group ID, or public workspace ID + deletion_summary (dict): Summary of deletions + workspace_type (str): 'personal', 'group', or 'public' + """ + conversations_deleted = deletion_summary.get('conversations_deleted', 0) + documents_deleted = deletion_summary.get('documents_deleted', 0) + + # Build message + message_parts = [] + if conversations_deleted > 0: + message_parts.append(f"{conversations_deleted} conversation{'s' if conversations_deleted != 1 else ''}") + if documents_deleted > 0: + message_parts.append(f"{documents_deleted} document{'s' if documents_deleted != 1 else ''}") + + message = f"Retention policy automatically deleted {' and '.join(message_parts)}." + + # Build details list + details = [] + + if conversations_deleted > 0: + conv_details = deletion_summary.get('conversation_details', []) + if conv_details: + details.append("**Conversations:**") + for conv in conv_details[:10]: # Limit to first 10 + details.append(f"• {conv.get('title', 'Untitled')}") + if len(conv_details) > 10: + details.append(f"• ...and {len(conv_details) - 10} more") + + if documents_deleted > 0: + doc_details = deletion_summary.get('document_details', []) + if doc_details: + details.append("\n**Documents:**") + for doc in doc_details[:10]: # Limit to first 10 + details.append(f"• {doc.get('file_name', 'Unknown')}") + if len(doc_details) > 10: + details.append(f"• ...and {len(doc_details) - 10} more") + + full_message = message + if details: + full_message += "\n\n" + "\n".join(details) + + # Create notification based on workspace type + if workspace_type == 'group': + create_group_notification( + group_id=workspace_id, + notification_type='system_announcement', + title='Retention Policy Cleanup', + message=full_message, + link_url='/chat', + metadata={ + 'conversations_deleted': conversations_deleted, + 'documents_deleted': documents_deleted, + 'deletion_date': datetime.now(timezone.utc).isoformat() + } + ) + elif workspace_type == 'public': + create_public_workspace_notification( + public_workspace_id=workspace_id, + notification_type='system_announcement', + title='Retention Policy Cleanup', + message=full_message, + link_url='/chat', + metadata={ + 'conversations_deleted': conversations_deleted, + 'documents_deleted': documents_deleted, + 'deletion_date': datetime.now(timezone.utc).isoformat() + } + ) + else: # personal + create_notification( + user_id=workspace_id, + notification_type='system_announcement', + title='Retention Policy Cleanup', + message=full_message, + link_url='/chat', + metadata={ + 'conversations_deleted': conversations_deleted, + 'documents_deleted': documents_deleted, + 'deletion_date': datetime.now(timezone.utc).isoformat() + } + ) + + debug_print(f"Sent retention notification to {workspace_type} workspace {workspace_id}") diff --git a/application/single_app/functions_settings.py b/application/single_app/functions_settings.py index 7c43e71d..a0575f54 100644 --- a/application/single_app/functions_settings.py +++ b/application/single_app/functions_settings.py @@ -144,6 +144,10 @@ def get_settings(use_cosmos=False): 'number_of_historical_messages_to_summarize': 10, 
'enable_summarize_content_history_beyond_conversation_history_limit': False, + # Multi-Modal Vision Analysis + 'enable_multimodal_vision': False, + 'multimodal_vision_model': '', + # Document Classification 'enable_document_classification': False, 'document_classification_categories': [ @@ -180,6 +184,7 @@ def get_settings(use_cosmos=False): 'enable_content_safety': False, 'require_member_of_safety_violation_admin': False, 'require_member_of_control_center_admin': False, + 'require_member_of_control_center_dashboard_reader': False, 'content_safety_endpoint': '', 'content_safety_key': '', 'content_safety_authentication_type': 'key', @@ -247,11 +252,30 @@ def get_settings(use_cosmos=False): "speech_service_location": '', "speech_service_locale": "en-US", "speech_service_key": "", + "speech_service_authentication_type": "key", # 'key' or 'managed_identity' + + # Speech-to-text chat input + "enable_speech_to_text_input": False, + + # Text-to-speech chat output + "enable_text_to_speech": False, #key vault settings 'enable_key_vault_secret_storage': False, 'key_vault_name': '', 'key_vault_identity': '', + + # Retention Policy Settings + 'enable_retention_policy_personal': False, + 'enable_retention_policy_group': False, + 'enable_retention_policy_public': False, + 'retention_policy_execution_hour': 2, # Run at 2 AM by default (0-23) + 'retention_policy_last_run': None, # ISO timestamp of last execution + 'retention_policy_next_run': None, # ISO timestamp of next scheduled execution + 'retention_conversation_min_days': 1, + 'retention_conversation_max_days': 3650, # ~10 years + 'retention_document_min_days': 1, + 'retention_document_max_days': 3650, # ~10 years } try: @@ -713,9 +737,36 @@ def sanitize_settings_for_user(full_settings: dict) -> dict: if "key" not in k.lower() and "storage_account_url" not in k.lower()} def sanitize_settings_for_logging(full_settings: dict) -> dict: - # Exclude any key containing "key", "base64", "storage_account_url" - return {k: v for k, v in full_settings.items() - if "key" not in k.lower() and "base64" not in k.lower() and "image" not in k.lower() and "storage_account_url" not in k.lower()} + """ + Recursively sanitize settings to remove sensitive data from debug logs. 
+ Filters out keys containing: key, base64, image, storage_account_url + Also filters out values containing base64 data + """ + if not isinstance(full_settings, dict): + return full_settings + + sanitized = {} + sensitive_key_terms = ["key", "base64", "image", "storage_account_url"] + + for k, v in full_settings.items(): + # Skip keys with sensitive terms + if any(term in k.lower() for term in sensitive_key_terms): + sanitized[k] = "[REDACTED]" + continue + + # Check if value is a string containing base64 data + if isinstance(v, str) and ("base64," in v or len(v) > 500): + sanitized[k] = "[BASE64_DATA_REDACTED]" + # Recursively sanitize nested dicts + elif isinstance(v, dict): + sanitized[k] = sanitize_settings_for_logging(v) + # Recursively sanitize lists + elif isinstance(v, list): + sanitized[k] = [sanitize_settings_for_logging(item) if isinstance(item, dict) else item for item in v] + else: + sanitized[k] = v + + return sanitized # Search history management functions def get_user_search_history(user_id): diff --git a/application/single_app/requirements.txt b/application/single_app/requirements.txt index a5467d9a..187b4a98 100644 --- a/application/single_app/requirements.txt +++ b/application/single_app/requirements.txt @@ -1,11 +1,10 @@ # requirements.txt pandas==2.2.3 azure-monitor-query==1.4.1 -opencensus-ext-azure==1.1.15 Flask==2.2.5 Flask-WTF==1.2.1 gunicorn -Werkzeug==3.0.6 +Werkzeug==3.1.4 requests==2.32.4 openai==1.67 docx2txt==0.8 @@ -31,7 +30,7 @@ azure-ai-contentsafety==1.0.0 azure-storage-blob==12.24.1 azure-storage-queue==12.12.0 azure-keyvault-secrets==4.10.0 -pypdf==6.1.3 +pypdf==6.4.0 python-docx==1.1.2 flask-executor==1.0.0 PyMuPDF==1.25.3 diff --git a/application/single_app/route_backend_agents.py b/application/single_app/route_backend_agents.py index d2e812b1..5032ebec 100644 --- a/application/single_app/route_backend_agents.py +++ b/application/single_app/route_backend_agents.py @@ -17,6 +17,7 @@ delete_group_agent, validate_group_agent_payload, ) +from functions_debug import debug_print from functions_authentication import * from functions_appinsights import log_event from json_schema_validation import validate_agent @@ -266,7 +267,7 @@ def create_group_agent_route(): try: saved = save_group_agent(active_group, payload) except Exception as exc: - current_app.logger.error('Failed to save group agent: %s', exc) + debug_print('Failed to save group agent: %s', exc) return jsonify({'error': 'Unable to save agent'}), 500 return jsonify(saved), 201 @@ -314,7 +315,7 @@ def update_group_agent_route(agent_id): try: saved = save_group_agent(active_group, merged) except Exception as exc: - current_app.logger.error('Failed to update group agent %s: %s', agent_id, exc) + debug_print('Failed to update group agent %s: %s', agent_id, exc) return jsonify({'error': 'Unable to update agent'}), 500 return jsonify(saved), 200 @@ -340,7 +341,7 @@ def delete_group_agent_route(agent_id): try: removed = delete_group_agent(active_group, agent_id) except Exception as exc: - current_app.logger.error('Failed to delete group agent %s: %s', agent_id, exc) + debug_print('Failed to delete group agent %s: %s', agent_id, exc) return jsonify({'error': 'Unable to delete agent'}), 500 if not removed: diff --git a/application/single_app/route_backend_chats.py b/application/single_app/route_backend_chats.py index ee5e144b..27c30e9c 100644 --- a/application/single_app/route_backend_chats.py +++ b/application/single_app/route_backend_chats.py @@ -38,9 +38,7 @@ def get_kernel_agents(): def 
register_route_backend_chats(app): @app.route('/api/chat', methods=['POST']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def chat_api(): @@ -450,12 +448,22 @@ def result_requires_message_reload(result: Any) -> bool: group_doc = find_group_by_id(active_group_id) debug_print(f"Workspace search group lookup result: {group_doc}") - if group_doc and group_doc.get('name'): - group_name = group_doc.get('name') - user_metadata['workspace_search']['group_name'] = group_name - debug_print(f"Workspace search - set group_name to: {group_name}") + if group_doc: + # Check if group status allows chat operations + from functions_group import check_group_status_allows_operation + allowed, reason = check_group_status_allows_operation(group_doc, 'chat') + if not allowed: + return jsonify({'error': reason}), 403 + + if group_doc.get('name'): + group_name = group_doc.get('name') + user_metadata['workspace_search']['group_name'] = group_name + debug_print(f"Workspace search - set group_name to: {group_name}") + else: + debug_print(f"Workspace search - no name for group: {active_group_id}") + user_metadata['workspace_search']['group_name'] = None else: - debug_print(f"Workspace search - no group found or no name for id: {active_group_id}") + debug_print(f"Workspace search - no group found for id: {active_group_id}") user_metadata['workspace_search']['group_name'] = None except Exception as e: @@ -465,6 +473,17 @@ def result_requires_message_reload(result: Any) -> bool: traceback.print_exc() if document_scope == 'public' and active_public_workspace_id: + # Check if public workspace status allows chat operations + try: + from functions_public_workspaces import find_public_workspace_by_id, check_public_workspace_status_allows_operation + workspace_doc = find_public_workspace_by_id(active_public_workspace_id) + if workspace_doc: + allowed, reason = check_public_workspace_status_allows_operation(workspace_doc, 'chat') + if not allowed: + return jsonify({'error': reason}), 403 + except Exception as e: + debug_print(f"Error checking public workspace status: {e}") + user_metadata['workspace_search']['active_public_workspace_id'] = active_public_workspace_id else: user_metadata['workspace_search'] = { @@ -936,12 +955,12 @@ def result_requires_message_reload(result: Any) -> bool: continue processed_doc_ids.add(doc_id) - # Determine workspace type from the search result fields doc_user_id = doc.get('user_id') doc_group_id = doc.get('group_id') doc_public_workspace_id = doc.get('public_workspace_id') + # Query Cosmos for this document's metadata metadata = get_document_metadata_for_citations( document_id=doc_id, @@ -950,17 +969,20 @@ def result_requires_message_reload(result: Any) -> bool: public_workspace_id=doc_public_workspace_id if doc_public_workspace_id else None ) + # If we have metadata with content, create additional citations if metadata: file_name = metadata.get('file_name', 'Unknown') keywords = metadata.get('keywords', []) abstract = metadata.get('abstract', '') + # Create citation for keywords if they exist if keywords and len(keywords) > 0: keywords_text = ', '.join(keywords) if isinstance(keywords, list) else str(keywords) keywords_citation_id = f"{doc_id}_keywords" + keywords_citation = { "file_name": file_name, "citation_id": keywords_citation_id, @@ -985,6 +1007,15 @@ def result_requires_message_reload(result: Any) -> bool: if abstract and len(abstract.strip()) > 0: abstract_citation_id = f"{doc_id}_abstract" + + # Add 
keywords to retrieved content for the model + keywords_context = f"Document Keywords ({file_name}): {keywords_text}" + retrieved_texts.append(keywords_context) + + # Create citation for abstract if it exists + if abstract and len(abstract.strip()) > 0: + abstract_citation_id = f"{doc_id}_abstract" + abstract_citation = { "file_name": file_name, "citation_id": abstract_citation_id, @@ -1005,6 +1036,11 @@ def result_requires_message_reload(result: Any) -> bool: abstract_context = f"Document Abstract ({file_name}): {abstract}" retrieved_texts.append(abstract_context) + + # Add abstract to retrieved content for the model + abstract_context = f"Document Abstract ({file_name}): {abstract}" + retrieved_texts.append(abstract_context) + # Create citation for vision analysis if it exists vision_analysis = metadata.get('vision_analysis') if vision_analysis: @@ -1043,6 +1079,7 @@ def result_requires_message_reload(result: Any) -> bool: vision_context = f"AI Vision Analysis ({file_name}): {vision_content}" retrieved_texts.append(vision_context) + # Update the system prompt with the enhanced content including metadata if retrieved_texts: retrieved_content = "\n\n".join(retrieved_texts) @@ -1050,6 +1087,12 @@ def result_requires_message_reload(result: Any) -> bool: Retrieved Excerpts: {retrieved_content} Based *only* on the information provided above, answer the user's query. If the answer isn't in the excerpts, say so. + + Retrieved Excerpts: + {retrieved_content} + + Based *only* on the information provided above, answer the user's query. If the answer isn't in the excerpts, say so. + Example User: What is the policy on double dipping? Assistant: The policy prohibits entities from using federal funds received through one program to apply for additional funds through another program, commonly known as 'double dipping' (Source: PolicyDocument.pdf, Page: 12) @@ -2571,16 +2614,20 @@ def generate(): try: user_settings_obj = get_user_settings(user_id) debug_print(f"[DEBUG] user_settings_obj type: {type(user_settings_obj)}") - debug_print(f"[DEBUG] user_settings_obj: {user_settings_obj}") + # Sanitize user_settings_obj to remove sensitive data (keys, base64, images) from debug logs + sanitized_settings = sanitize_settings_for_logging(user_settings_obj) if isinstance(user_settings_obj, dict) else user_settings_obj + debug_print(f"[DEBUG] user_settings_obj (sanitized): {sanitized_settings}") # user_settings_obj might be nested with 'settings' key if isinstance(user_settings_obj, dict): if 'settings' in user_settings_obj: user_settings = user_settings_obj['settings'] - debug_print(f"[DEBUG] Extracted user_settings from 'settings' key: {user_settings}") + sanitized_user_settings = sanitize_settings_for_logging(user_settings) if isinstance(user_settings, dict) else user_settings + debug_print(f"[DEBUG] Extracted user_settings from 'settings' key (sanitized): {sanitized_user_settings}") else: user_settings = user_settings_obj - debug_print(f"[DEBUG] Using user_settings_obj directly: {user_settings}") + sanitized_user_settings = sanitize_settings_for_logging(user_settings) if isinstance(user_settings, dict) else user_settings + debug_print(f"[DEBUG] Using user_settings_obj directly (sanitized): {sanitized_user_settings}") user_enable_agents = user_settings.get('enable_agents', False) debug_print(f"[DEBUG] user_enable_agents={user_enable_agents}") @@ -2800,6 +2847,18 @@ def generate(): traceback.print_exc() if document_scope == 'public' and active_public_workspace_id: + # Check if public workspace status allows chat 
operations + try: + from functions_public_workspaces import find_public_workspace_by_id, check_public_workspace_status_allows_operation + workspace_doc = find_public_workspace_by_id(active_public_workspace_id) + if workspace_doc: + allowed, reason = check_public_workspace_status_allows_operation(workspace_doc, 'chat') + if not allowed: + yield f"data: {json.dumps({'error': reason})}\n\n" + return + except Exception as e: + debug_print(f"Error checking public workspace status: {e}") + user_metadata['workspace_search']['active_public_workspace_id'] = active_public_workspace_id else: user_metadata['workspace_search'] = { @@ -3489,6 +3548,41 @@ def make_json_serializable(obj): } cosmos_messages_container.upsert_item(assistant_doc) + # Log chat token usage to activity_logs for easy reporting + if token_usage_data and token_usage_data.get('total_tokens'): + try: + from functions_activity_logging import log_token_usage + + # Determine workspace type based on active group/public workspace + workspace_type = 'personal' + if active_public_workspace_id: + workspace_type = 'public' + elif active_group_id: + workspace_type = 'group' + + log_token_usage( + user_id=user_id, + token_type='chat', + total_tokens=token_usage_data.get('total_tokens'), + model=final_model_used if use_agent_streaming else gpt_model, + workspace_type=workspace_type, + prompt_tokens=token_usage_data.get('prompt_tokens'), + completion_tokens=token_usage_data.get('completion_tokens'), + conversation_id=conversation_id, + message_id=assistant_message_id, + group_id=active_group_id, + public_workspace_id=active_public_workspace_id, + additional_context={ + 'agent_name': agent_name_used if use_agent_streaming else None, + 'augmented': bool(system_messages_for_augmentation), + 'reasoning_effort': reasoning_effort + } + ) + debug_print(f"✅ Logged streaming chat token usage: {token_usage_data.get('total_tokens')} tokens") + except Exception as log_error: + debug_print(f"⚠️ Warning: Failed to log streaming chat token usage: {log_error}") + # Don't fail the chat flow if logging fails + # Update conversation conversation_item['last_updated'] = datetime.utcnow().isoformat() @@ -3591,9 +3685,7 @@ def make_json_serializable(obj): ) @app.route('/api/message//mask', methods=['POST']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def mask_message_api(message_id): diff --git a/application/single_app/route_backend_control_center.py b/application/single_app/route_backend_control_center.py index 8577e5d5..0e5bcc29 100644 --- a/application/single_app/route_backend_control_center.py +++ b/application/single_app/route_backend_control_center.py @@ -5,7 +5,10 @@ from functions_settings import * from functions_logging import * from functions_activity_logging import * -from functions_documents import update_document +from functions_approvals import * +from functions_documents import update_document, delete_document, delete_document_chunks +from functions_group import delete_group +from utils_cache import invalidate_group_search_cache from swagger_wrapper import swagger_route, get_auth_security from datetime import datetime, timedelta, timezone import json @@ -98,7 +101,7 @@ def enhance_user_with_activity(user, force_refresh=False): cached_metrics = user.get('settings', {}).get('metrics') if cached_metrics and cached_metrics.get('calculated_at'): try: - current_app.logger.debug(f"Using cached metrics for user {user.get('id')}") + debug_print(f"Using cached metrics for user 
{user.get('id')}") # Use cached data regardless of age when not forcing refresh if 'login_metrics' in cached_metrics: enhanced['activity']['login_metrics'] = cached_metrics['login_metrics'] @@ -112,14 +115,14 @@ def enhance_user_with_activity(user, force_refresh=False): enhanced['activity']['document_metrics'] = cached_doc_metrics return enhanced except Exception as cache_e: - current_app.logger.debug(f"Error using cached metrics for user {user.get('id')}: {cache_e}") + debug_print(f"Error using cached metrics for user {user.get('id')}: {cache_e}") # If no cached metrics and not forcing refresh, return with default/empty metrics # Do NOT include enhanced_citation_enabled in user data - frontend gets it from app settings - current_app.logger.debug(f"No cached metrics for user {user.get('id')}, returning default values (use refresh button to calculate)") + debug_print(f"No cached metrics for user {user.get('id')}, returning default values (use refresh button to calculate)") return enhanced - current_app.logger.debug(f"Force refresh requested - calculating fresh metrics for user {user.get('id')}") + debug_print(f"Force refresh requested - calculating fresh metrics for user {user.get('id')}") # Try to get comprehensive conversation metrics @@ -214,10 +217,10 @@ def enhance_user_with_activity(user, force_refresh=False): batch_size = size_result[0] if size_result else 0 total_message_size += batch_size or 0 - current_app.logger.debug(f"Messages batch {i//batch_size + 1}: {batch_messages} messages, {batch_size or 0} bytes") + debug_print(f"Messages batch {i//batch_size + 1}: {batch_messages} messages, {batch_size or 0} bytes") except Exception as msg_e: - current_app.logger.error(f"Could not query message sizes for batch {i//batch_size + 1}: {msg_e}") + debug_print(f"Could not query message sizes for batch {i//batch_size + 1}: {msg_e}") # Try individual conversation queries as fallback for conv_id in batch_ids: try: @@ -250,15 +253,15 @@ def enhance_user_with_activity(user, force_refresh=False): total_message_size += size_result[0] if size_result and size_result[0] else 0 except Exception as individual_e: - current_app.logger.debug(f"Could not query individual conversation {conv_id}: {individual_e}") + debug_print(f"Could not query individual conversation {conv_id}: {individual_e}") continue enhanced['activity']['chat_metrics']['total_messages'] = total_messages enhanced['activity']['chat_metrics']['total_message_size'] = total_message_size - current_app.logger.debug(f"Final chat metrics for user {user.get('id')}: {total_messages} messages, {total_message_size} bytes") + debug_print(f"Final chat metrics for user {user.get('id')}: {total_messages} messages, {total_message_size} bytes") except Exception as e: - current_app.logger.debug(f"Could not get chat metrics for user {user.get('id')}: {e}") + debug_print(f"Could not get chat metrics for user {user.get('id')}: {e}") # Try to get comprehensive login metrics try: @@ -291,7 +294,7 @@ def enhance_user_with_activity(user, force_refresh=False): enhanced['activity']['login_metrics']['last_login'] = login_record.get('timestamp') or login_record.get('created_at') except Exception as e: - current_app.logger.debug(f"Could not get login metrics for user {user.get('id')}: {e}") + debug_print(f"Could not get login metrics for user {user.get('id')}: {e}") # Try to get comprehensive document metrics try: @@ -325,7 +328,7 @@ def enhance_user_with_activity(user, force_refresh=False): enhanced['activity']['document_metrics']['total_documents'] = total_docs # AI 
search size = pages × 80KB - enhanced['activity']['document_metrics']['ai_search_size'] = total_pages * 80 * 1024 # 80KB per page + enhanced['activity']['document_metrics']['ai_search_size'] = total_pages * 22 * 1024 # 22KB per page # Last day upload tracking removed - keeping only document count and sizes @@ -352,14 +355,14 @@ def enhance_user_with_activity(user, force_refresh=False): total_storage_size += blob.size blob_count += 1 debug_print(f"💾 [STORAGE DEBUG] Blob {blob.name}: {blob.size} bytes") - current_app.logger.debug(f"Storage blob {blob.name}: {blob.size} bytes") + debug_print(f"Storage blob {blob.name}: {blob.size} bytes") debug_print(f"💾 [STORAGE DEBUG] Found {blob_count} blobs, total size: {total_storage_size} bytes") enhanced['activity']['document_metrics']['storage_account_size'] = total_storage_size - current_app.logger.debug(f"Total storage size for user {user.get('id')}: {total_storage_size} bytes") + debug_print(f"Total storage size for user {user.get('id')}: {total_storage_size} bytes") else: debug_print(f"💾 [STORAGE DEBUG] Storage client NOT available for user {user.get('id')}") - current_app.logger.debug(f"Storage client not available for user {user.get('id')}") + debug_print(f"Storage client not available for user {user.get('id')}") # Fallback to estimation if storage client not available storage_size_query = """ SELECT c.file_name, c.number_of_pages FROM c @@ -394,16 +397,16 @@ def enhance_user_with_activity(user, force_refresh=False): enhanced['activity']['document_metrics']['storage_account_size'] = total_storage_size debug_print(f"💾 [STORAGE DEBUG] Fallback estimation complete: {total_storage_size} bytes") - current_app.logger.debug(f"Estimated storage size for user {user.get('id')}: {total_storage_size} bytes") + debug_print(f"Estimated storage size for user {user.get('id')}: {total_storage_size} bytes") except Exception as storage_e: debug_print(f"❌ [STORAGE DEBUG] Storage calculation failed for user {user.get('id')}: {storage_e}") - current_app.logger.debug(f"Could not calculate storage size for user {user.get('id')}: {storage_e}") + debug_print(f"Could not calculate storage size for user {user.get('id')}: {storage_e}") # Set to 0 if we can't calculate enhanced['activity']['document_metrics']['storage_account_size'] = 0 except Exception as e: - current_app.logger.debug(f"Could not get document metrics for user {user.get('id')}: {e}") + debug_print(f"Could not get document metrics for user {user.get('id')}: {e}") # Save calculated metrics to user settings for caching (only if we calculated fresh data) if force_refresh or not user.get('settings', {}).get('metrics', {}).get('calculated_at'): @@ -428,17 +431,17 @@ def enhance_user_with_activity(user, force_refresh=False): update_success = update_user_settings(user.get('id'), settings_update) if update_success: - current_app.logger.debug(f"Successfully cached metrics for user {user.get('id')}") + debug_print(f"Successfully cached metrics for user {user.get('id')}") else: - current_app.logger.debug(f"Failed to cache metrics for user {user.get('id')}") + debug_print(f"Failed to cache metrics for user {user.get('id')}") except Exception as cache_save_e: - current_app.logger.debug(f"Error saving metrics cache for user {user.get('id')}: {cache_save_e}") + debug_print(f"Error saving metrics cache for user {user.get('id')}: {cache_save_e}") return enhanced except Exception as e: - current_app.logger.error(f"Error enhancing user data: {e}") + debug_print(f"Error enhancing user data: {e}") return user # Return original 
user data if enhancement fails def enhance_public_workspace_with_activity(workspace, force_refresh=False): @@ -472,14 +475,16 @@ def enhance_public_workspace_with_activity(workspace, force_refresh=False): 'created_at': workspace.get('createdDate'), # Alias for frontend # Flat fields expected by frontend - 'owner_name': owner_info.get('display_name') or owner_info.get('name', 'Unknown'), + 'owner_name': owner_info.get('displayName') or owner_info.get('display_name') or owner_info.get('name', 'Unknown'), 'owner_email': owner_info.get('email', ''), - 'created_by': owner_info.get('display_name') or owner_info.get('name', 'Unknown'), + 'created_by': owner_info.get('displayName') or owner_info.get('display_name') or owner_info.get('name', 'Unknown'), 'document_count': 0, # Will be updated from database + 'member_count': len(workspace.get('admins', [])) + len(workspace.get('documentManagers', [])) + (1 if owner_info else 0), # Total members including owner 'storage_size': 0, # Will be updated from storage account 'last_activity': None, # Will be updated from public_documents 'recent_activity_count': 0, # Will be calculated - 'status': 'active', # default - can be determined by business logic + 'status': workspace.get('status', 'active'), # Read from workspace document, default to 'active' + 'statusHistory': workspace.get('statusHistory', []), # Include status change history # Keep nested structure for backward compatibility 'activity': { @@ -513,7 +518,12 @@ def enhance_public_workspace_with_activity(workspace, force_refresh=False): # Update flat fields enhanced['document_count'] = doc_metrics.get('total_documents', 0) enhanced['storage_size'] = doc_metrics.get('storage_account_size', 0) - # Cached document metrics applied successfully + + # Apply cached activity metrics if available + if 'last_activity' in cached_metrics: + enhanced['last_activity'] = cached_metrics['last_activity'] + if 'recent_activity_count' in cached_metrics: + enhanced['recent_activity_count'] = cached_metrics['recent_activity_count'] debug_print(f"🌐 [PUBLIC WORKSPACE DEBUG] Returning cached data for {workspace_id}: {enhanced['activity']['document_metrics']}") return enhanced @@ -586,7 +596,7 @@ def enhance_public_workspace_with_activity(workspace, force_refresh=False): )) total_pages = pages_sum_result[0] if pages_sum_result and pages_sum_result[0] else 0 - ai_search_size = total_pages * 80 * 1024 # 80KB per page + ai_search_size = total_pages * 22 * 1024 # 22KB per page enhanced['activity']['document_metrics']['ai_search_size'] = ai_search_size debug_print(f"📊 [PUBLIC WORKSPACE DOCUMENT DEBUG] Workspace {workspace_id}: {total_documents} documents, {total_pages} pages, {ai_search_size} AI search size") @@ -688,6 +698,8 @@ def enhance_public_workspace_with_activity(workspace, force_refresh=False): try: metrics_cache = { 'document_metrics': enhanced['activity']['document_metrics'], + 'last_activity': enhanced.get('last_activity'), + 'recent_activity_count': enhanced.get('recent_activity_count', 0), 'calculated_at': datetime.now(timezone.utc).isoformat() } @@ -702,7 +714,7 @@ def enhance_public_workspace_with_activity(workspace, force_refresh=False): return enhanced except Exception as e: - current_app.logger.error(f"Error enhancing public workspace data: {e}") + debug_print(f"Error enhancing public workspace data: {e}") return workspace # Return original workspace data if enhancement fails def enhance_group_with_activity(group, force_refresh=False): @@ -739,15 +751,16 @@ def enhance_group_with_activity(group, 
force_refresh=False): 'created_at': group.get('createdDate'), # Alias for frontend # Flat fields expected by frontend - 'owner_name': owner_info.get('display_name') or owner_info.get('name', 'Unknown'), + 'owner_name': owner_info.get('displayName') or owner_info.get('display_name') or owner_info.get('name', 'Unknown'), 'owner_email': owner_info.get('email', ''), - 'created_by': owner_info.get('display_name') or owner_info.get('name', 'Unknown'), - 'member_count': len(users_list) + (1 if owner_info else 0), + 'created_by': owner_info.get('displayName') or owner_info.get('display_name') or owner_info.get('name', 'Unknown'), + 'member_count': len(users_list), # Owner is already included in users_list 'document_count': 0, # Will be updated from database 'storage_size': 0, # Will be updated from storage account 'last_activity': None, # Will be updated from group_documents 'recent_activity_count': 0, # Will be calculated - 'status': 'active', # default - can be determined by business logic + 'status': group.get('status', 'active'), # Read from group document, default to 'active' + 'statusHistory': group.get('statusHistory', []), # Include status change history # Keep nested structure for backward compatibility 'activity': { @@ -757,7 +770,7 @@ def enhance_group_with_activity(group, force_refresh=False): 'storage_account_size': 0 # Actual file sizes from storage }, 'member_metrics': { - 'total_members': len(users_list) + (1 if owner_info else 0), + 'total_members': len(users_list), # Owner is already included in users_list 'admin_count': len(group.get('admins', [])), 'document_manager_count': len(group.get('documentManagers', [])), 'pending_count': len(group.get('pendingUsers', [])) @@ -853,11 +866,11 @@ def enhance_group_with_activity(group, force_refresh=False): enhanced['activity']['document_metrics']['total_documents'] = total_docs enhanced['document_count'] = total_docs # Update flat field - # AI search size = pages × 80KB - enhanced['activity']['document_metrics']['ai_search_size'] = total_pages * 80 * 1024 # 80KB per page + # AI search size = pages × 22KB + enhanced['activity']['document_metrics']['ai_search_size'] = total_pages * 22 * 1024 # 22KB per page debug_print(f"📄 [GROUP DOCUMENT DEBUG] Total documents for group {group_id}: {total_docs}") - debug_print(f"📊 [GROUP AI SEARCH DEBUG] Total pages for group {group_id}: {total_pages}, AI search size: {total_pages * 80 * 1024} bytes") + debug_print(f"📊 [GROUP AI SEARCH DEBUG] Total pages for group {group_id}: {total_pages}, AI search size: {total_pages * 22 * 1024} bytes") # Last day upload tracking removed - keeping only document count and sizes debug_print(f"� [GROUP DOCUMENT DEBUG] Document metrics calculation complete for group {group_id}") @@ -965,15 +978,15 @@ def enhance_group_with_activity(group, force_refresh=False): total_storage_size += blob.size blob_count += 1 debug_print(f"💾 [GROUP STORAGE DEBUG] Blob {blob.name}: {blob.size} bytes") - current_app.logger.debug(f"Group storage blob {blob.name}: {blob.size} bytes") + debug_print(f"Group storage blob {blob.name}: {blob.size} bytes") debug_print(f"💾 [GROUP STORAGE DEBUG] Found {blob_count} blobs, total size: {total_storage_size} bytes") enhanced['activity']['document_metrics']['storage_account_size'] = total_storage_size enhanced['storage_size'] = total_storage_size # Update flat field - current_app.logger.debug(f"Total storage size for group {group_id}: {total_storage_size} bytes") + debug_print(f"Total storage size for group {group_id}: {total_storage_size} bytes") else: 
debug_print(f"💾 [GROUP STORAGE DEBUG] Storage client NOT available for group {group_id}") - current_app.logger.debug(f"Storage client not available for group {group_id}") + debug_print(f"Storage client not available for group {group_id}") # Fallback to estimation if storage client not available storage_size_query = """ SELECT c.file_name, c.number_of_pages FROM c @@ -1009,11 +1022,11 @@ def enhance_group_with_activity(group, force_refresh=False): enhanced['activity']['document_metrics']['storage_account_size'] = total_storage_size enhanced['storage_size'] = total_storage_size # Update flat field debug_print(f"💾 [GROUP STORAGE DEBUG] Fallback estimation complete: {total_storage_size} bytes") - current_app.logger.debug(f"Estimated storage size for group {group_id}: {total_storage_size} bytes") + debug_print(f"Estimated storage size for group {group_id}: {total_storage_size} bytes") except Exception as storage_e: debug_print(f"❌ [GROUP STORAGE DEBUG] Storage calculation failed for group {group_id}: {storage_e}") - current_app.logger.debug(f"Could not calculate storage size for group {group_id}: {storage_e}") + debug_print(f"Could not calculate storage size for group {group_id}: {storage_e}") # Set to 0 if we can't calculate enhanced['activity']['document_metrics']['storage_account_size'] = 0 enhanced['storage_size'] = 0 @@ -1037,7 +1050,7 @@ def enhance_group_with_activity(group, force_refresh=False): return enhanced except Exception as e: - current_app.logger.error(f"Error enhancing group data: {e}") + debug_print(f"Error enhancing group data: {e}") return group # Return original group data if enhancement fails def get_activity_trends_data(start_date, end_date): @@ -1063,10 +1076,18 @@ def get_activity_trends_data(start_date, end_date): date_key = current_date.strftime('%Y-%m-%d') daily_data[date_key] = { 'date': date_key, - 'chats': 0, - 'personal_documents': 0, # Track personal documents separately - 'group_documents': 0, # Track group documents separately - 'public_documents': 0, # Track public documents separately + 'chats_created': 0, + 'chats_deleted': 0, + 'chats': 0, # Keep for backward compatibility + 'personal_documents_created': 0, + 'personal_documents_deleted': 0, + 'group_documents_created': 0, + 'group_documents_deleted': 0, + 'public_documents_created': 0, + 'public_documents_deleted': 0, + 'personal_documents': 0, # Keep for backward compatibility + 'group_documents': 0, # Keep for backward compatibility + 'public_documents': 0, # Keep for backward compatibility 'documents': 0, # Keep for backward compatibility 'logins': 0, 'total': 0 @@ -1083,12 +1104,11 @@ def get_activity_trends_data(start_date, end_date): debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Query parameters: {parameters}") - # Query 1: Get chat activity from conversations and messages containers + # Query 1: Get chat activity from activity logs (both creation and deletion) try: debug_print("🔍 [ACTIVITY TRENDS DEBUG] Querying conversations...") - # Count conversations using activity_logs container (conversation_creation activity_type) - # This uses permanent activity log records instead of querying the conversations container + # Count conversation creations conversations_query = """ SELECT c.timestamp, c.created_at FROM c @@ -1097,7 +1117,6 @@ def get_activity_trends_data(start_date, end_date): OR (c.created_at >= @start_date AND c.created_at <= @end_date)) """ - # Process conversations from activity logs conversations = list(cosmos_activity_logs_container.query_items( query=conversations_query, 
parameters=parameters, @@ -1107,7 +1126,6 @@ def get_activity_trends_data(start_date, end_date): debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Found {len(conversations)} conversation creation logs") for conv in conversations: - # Use timestamp or created_at from activity log timestamp = conv.get('timestamp') or conv.get('created_at') if timestamp: try: @@ -1118,19 +1136,52 @@ def get_activity_trends_data(start_date, end_date): date_key = conv_date.strftime('%Y-%m-%d') if date_key in daily_data: - daily_data[date_key]['chats'] += 1 + daily_data[date_key]['chats_created'] += 1 + daily_data[date_key]['chats'] += 1 # Keep total for backward compatibility + except Exception as e: + debug_print(f"Could not parse conversation timestamp {timestamp}: {e}") + + # Count conversation deletions + deletions_query = """ + SELECT c.timestamp, c.created_at + FROM c + WHERE c.activity_type = 'conversation_deletion' + AND ((c.timestamp >= @start_date AND c.timestamp <= @end_date) + OR (c.created_at >= @start_date AND c.created_at <= @end_date)) + """ + + deletions = list(cosmos_activity_logs_container.query_items( + query=deletions_query, + parameters=parameters, + enable_cross_partition_query=True + )) + + debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Found {len(deletions)} conversation deletion logs") + + for deletion in deletions: + timestamp = deletion.get('timestamp') or deletion.get('created_at') + if timestamp: + try: + if isinstance(timestamp, str): + del_date = datetime.fromisoformat(timestamp.replace('Z', '+00:00') if 'Z' in timestamp else timestamp) + else: + del_date = timestamp + + date_key = del_date.strftime('%Y-%m-%d') + if date_key in daily_data: + daily_data[date_key]['chats_deleted'] += 1 except Exception as e: - current_app.logger.debug(f"Could not parse conversation timestamp {timestamp}: {e}") + debug_print(f"Could not parse deletion timestamp {timestamp}: {e}") except Exception as e: - current_app.logger.warning(f"Could not query conversation activity logs: {e}") + debug_print(f"Could not query conversation activity logs: {e}") print(f"❌ [ACTIVITY TRENDS DEBUG] Error querying chats: {e}") - # Query 2: Get document activity from activity_logs container (document_creation activity_type) - # This uses permanent activity log records and unified workspace tracking + # Query 2: Get document activity from activity_logs (both creation and deletion) try: debug_print("🔍 [ACTIVITY TRENDS DEBUG] Querying documents from activity logs...") + # Document creations documents_query = """ SELECT c.timestamp, c.created_at, c.workspace_type FROM c @@ -1139,7 +1190,6 @@ def get_activity_trends_data(start_date, end_date): OR (c.created_at >= @start_date AND c.created_at <= @end_date)) """ - # Query activity logs for all document types docs = list(cosmos_activity_logs_container.query_items( query=documents_query, parameters=parameters, @@ -1149,7 +1199,6 @@ def get_activity_trends_data(start_date, end_date): debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Found {len(docs)} document creation logs") for doc in docs: - # Use timestamp or created_at from activity log timestamp = doc.get('timestamp') or doc.get('created_at') workspace_type = doc.get('workspace_type', 'personal') @@ -1162,23 +1211,63 @@ def get_activity_trends_data(start_date, end_date): date_key = doc_date.strftime('%Y-%m-%d') if date_key in daily_data: - # Increment workspace-specific counter if workspace_type == 'group': + daily_data[date_key]['group_documents_created'] += 1 daily_data[date_key]['group_documents'] += 1 elif workspace_type == 'public': + 
daily_data[date_key]['public_documents_created'] += 1 daily_data[date_key]['public_documents'] += 1 else: + daily_data[date_key]['personal_documents_created'] += 1 daily_data[date_key]['personal_documents'] += 1 - # Keep total for backward compatibility daily_data[date_key]['documents'] += 1 except Exception as e: - current_app.logger.debug(f"Could not parse document timestamp {timestamp}: {e}") + debug_print(f"Could not parse document timestamp {timestamp}: {e}") + + # Document deletions + deletions_query = """ + SELECT c.timestamp, c.created_at, c.workspace_type + FROM c + WHERE c.activity_type = 'document_deletion' + AND ((c.timestamp >= @start_date AND c.timestamp <= @end_date) + OR (c.created_at >= @start_date AND c.created_at <= @end_date)) + """ + + doc_deletions = list(cosmos_activity_logs_container.query_items( + query=deletions_query, + parameters=parameters, + enable_cross_partition_query=True + )) + + debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Found {len(doc_deletions)} document deletion logs") + + for doc in doc_deletions: + timestamp = doc.get('timestamp') or doc.get('created_at') + workspace_type = doc.get('workspace_type', 'personal') + + if timestamp: + try: + if isinstance(timestamp, str): + doc_date = datetime.fromisoformat(timestamp.replace('Z', '+00:00') if 'Z' in timestamp else timestamp) + else: + doc_date = timestamp + + date_key = doc_date.strftime('%Y-%m-%d') + if date_key in daily_data: + if workspace_type == 'group': + daily_data[date_key]['group_documents_deleted'] += 1 + elif workspace_type == 'public': + daily_data[date_key]['public_documents_deleted'] += 1 + else: + daily_data[date_key]['personal_documents_deleted'] += 1 + except Exception as e: + debug_print(f"Could not parse document deletion timestamp {timestamp}: {e}") - debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Total documents found: {len(docs)}") + debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Total documents found: {len(docs)} created, {len(doc_deletions)} deleted") except Exception as e: - current_app.logger.warning(f"Could not query document activity logs: {e}") + debug_print(f"Could not query document activity logs: {e}") print(f"❌ [ACTIVITY TRENDS DEBUG] Error querying documents: {e}") # Query 3: Get login activity from activity_logs container @@ -1233,10 +1322,10 @@ def get_activity_trends_data(start_date, end_date): if date_key in daily_data: daily_data[date_key]['logins'] += 1 except Exception as e: - current_app.logger.debug(f"Could not parse login timestamp {timestamp}: {e}") + debug_print(f"Could not parse login timestamp {timestamp}: {e}") except Exception as e: - current_app.logger.warning(f"Could not query activity logs for login data: {e}") + debug_print(f"Could not query activity logs for login data: {e}") print(f"❌ [ACTIVITY TRENDS DEBUG] Error querying logins: {e}") # Query 4: Get token usage from activity_logs (token_usage activity_type) @@ -1286,12 +1375,12 @@ def get_activity_trends_data(start_date, end_date): if date_key in token_daily_data: token_daily_data[date_key][token_type] += token_count except Exception as e: - current_app.logger.debug(f"Could not parse token timestamp {timestamp}: {e}") + debug_print(f"Could not parse token timestamp {timestamp}: {e}") debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Token daily data: {token_daily_data}") except Exception as e: - current_app.logger.warning(f"Could not query activity logs for token usage: {e}") + debug_print(f"Could not query activity logs for token usage: {e}") print(f"❌ [ACTIVITY TRENDS DEBUG] Error querying tokens: {e}") # Initialize empty 
token data on error token_daily_data = {} @@ -1312,20 +1401,36 @@ def get_activity_trends_data(start_date, end_date): # Group by activity type for chart display result = { 'chats': {}, + 'chats_created': {}, + 'chats_deleted': {}, 'documents': {}, # Keep for backward compatibility - 'personal_documents': {}, # New: personal documents only - 'group_documents': {}, # New: group documents only - 'public_documents': {}, # New: public documents only + 'personal_documents': {}, # Keep for backward compatibility + 'group_documents': {}, # Keep for backward compatibility + 'public_documents': {}, # Keep for backward compatibility + 'personal_documents_created': {}, + 'personal_documents_deleted': {}, + 'group_documents_created': {}, + 'group_documents_deleted': {}, + 'public_documents_created': {}, + 'public_documents_deleted': {}, 'logins': {}, 'tokens': token_daily_data # Token usage by type (embedding, chat) } for date_key, data in daily_data.items(): result['chats'][date_key] = data['chats'] - result['documents'][date_key] = data['documents'] # Total for backward compatibility + result['chats_created'][date_key] = data['chats_created'] + result['chats_deleted'][date_key] = data['chats_deleted'] + result['documents'][date_key] = data['documents'] result['personal_documents'][date_key] = data['personal_documents'] result['group_documents'][date_key] = data['group_documents'] result['public_documents'][date_key] = data['public_documents'] + result['personal_documents_created'][date_key] = data['personal_documents_created'] + result['personal_documents_deleted'][date_key] = data['personal_documents_deleted'] + result['group_documents_created'][date_key] = data['group_documents_created'] + result['group_documents_deleted'][date_key] = data['group_documents_deleted'] + result['public_documents_created'][date_key] = data['public_documents_created'] + result['public_documents_deleted'][date_key] = data['public_documents_deleted'] result['logins'][date_key] = data['logins'] debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Final result: {result}") @@ -1333,7 +1438,7 @@ def get_activity_trends_data(start_date, end_date): return result except Exception as e: - current_app.logger.error(f"Error getting activity trends data: {e}") + debug_print(f"Error getting activity trends data: {e}") print(f"❌ [ACTIVITY TRENDS DEBUG] Fatal error: {e}") return { 'chats': {}, @@ -1934,7 +2039,7 @@ def get_document_storage_size(doc, cosmos_container, container_name, folder_pref return result except Exception as e: - current_app.logger.error(f"Error getting raw activity trends data: {e}") + debug_print(f"Error getting raw activity trends data: {e}") debug_print(f"❌ [RAW ACTIVITY DEBUG] Fatal error: {e}") return {} @@ -1945,8 +2050,7 @@ def register_route_backend_control_center(app): @app.route('/api/admin/control-center/users', methods=['GET']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required + @control_center_required('admin') def api_get_all_users(): """ Get all users with their settings, activity data, and access status. 
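The control-center routes below replace the paired `@admin_required` / `@control_center_admin_required` decorators with a single `@control_center_required('admin')`. Its implementation is not part of this diff; the following is only a minimal sketch of what such a role-parameterized guard could look like, assuming roles are carried in the Flask session and that role names such as `ControlCenterAdmin` and `ControlCenterDashboardReader` mirror the new `require_member_of_control_center_*` settings added above.

```python
# Illustrative sketch only. The real control_center_required is defined elsewhere in the app.
from functools import wraps
from flask import jsonify, session

def control_center_required(level):
    """Reject the request unless the signed-in user holds a sufficient control-center role."""
    def decorator(view_func):
        @wraps(view_func)
        def wrapper(*args, **kwargs):
            roles = session.get('user', {}).get('roles', [])
            if level == 'admin':
                allowed = 'ControlCenterAdmin' in roles
            else:
                # Assumed read-only tier, e.g. a dashboard reader.
                allowed = ('ControlCenterAdmin' in roles
                           or 'ControlCenterDashboardReader' in roles)
            if not allowed:
                return jsonify({'error': 'Forbidden'}), 403
            return view_func(*args, **kwargs)
        return wrapper
    return decorator
```

A factory like this keeps one decorator per route while the new `require_member_of_control_center_dashboard_reader` setting suggests a second, read-only access level alongside full admin.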
@@ -2048,14 +2152,13 @@ def api_get_all_users(): }), 200 except Exception as e: - current_app.logger.error(f"Error getting users: {e}") + debug_print(f"Error getting users: {e}") return jsonify({'error': 'Failed to retrieve users'}), 500 @app.route('/api/admin/control-center/users//access', methods=['PATCH']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required + @control_center_required('admin') def api_update_user_access(user_id): """ Update user access permissions (allow/deny with optional time-based restriction). @@ -2105,14 +2208,13 @@ def api_update_user_access(user_id): return jsonify({'error': 'Failed to update user access'}), 500 except Exception as e: - current_app.logger.error(f"Error updating user access: {e}") + debug_print(f"Error updating user access: {e}") return jsonify({'error': 'Failed to update user access'}), 500 @app.route('/api/admin/control-center/users//file-uploads', methods=['PATCH']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required + @control_center_required('admin') def api_update_user_file_uploads(user_id): """ Update user file upload permissions (allow/deny with optional time-based restriction). @@ -2162,14 +2264,87 @@ def api_update_user_file_uploads(user_id): return jsonify({'error': 'Failed to update user file upload permissions'}), 500 except Exception as e: - current_app.logger.error(f"Error updating user file uploads: {e}") + debug_print(f"Error updating user file uploads: {e}") return jsonify({'error': 'Failed to update user file upload permissions'}), 500 + @app.route('/api/admin/control-center/users//delete-documents', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_delete_user_documents_admin(user_id): + """ + Create an approval request to delete all documents for a user. + Requires approval from another admin. 
+ + Body: + reason (str): Explanation for deleting documents (required) + """ + try: + data = request.get_json() or {} + reason = data.get('reason', '').strip() + + if not reason: + return jsonify({'error': 'Reason is required for document deletion'}), 400 + + admin_user = session.get('user', {}) + admin_user_id = admin_user.get('oid') or admin_user.get('sub') + admin_email = admin_user.get('preferred_username', admin_user.get('email', 'unknown')) + admin_display_name = admin_user.get('name', admin_email) + + # Validate user exists by trying to get their data from Cosmos + try: + user_doc = cosmos_user_settings_container.read_item( + item=user_id, + partition_key=user_id + ) + user_email = user_doc.get('email', 'unknown') + user_name = user_doc.get('display_name', user_email) + except Exception: + return jsonify({'error': 'User not found'}), 404 + + # Create approval request using user_id as both group_id (for partition) and storing user_id in metadata + from functions_approvals import create_approval_request, TYPE_DELETE_USER_DOCUMENTS + approval = create_approval_request( + request_type=TYPE_DELETE_USER_DOCUMENTS, + group_id=user_id, # Using user_id as partition key for user-related approvals + requester_id=admin_user_id, + requester_email=admin_email, + requester_name=admin_display_name, + reason=reason, + metadata={ + 'user_id': user_id, + 'user_name': user_name, + 'user_email': user_email + } + ) + + # Log event + log_event("[ControlCenter] Delete User Documents Request Created", { + "admin_user": admin_email, + "user_id": user_id, + "user_email": user_email, + "approval_id": approval['id'], + "reason": reason + }) + + return jsonify({ + 'success': True, + 'message': 'Document deletion request created successfully. Awaiting approval from another admin.', + 'approval_id': approval['id'] + }), 200 + + except Exception as e: + debug_print(f"Error creating user document deletion request: {e}") + log_event("[ControlCenter] Delete User Documents Request Failed", { + "error": str(e), + "user_id": user_id + }) + return jsonify({'error': str(e)}), 500 + @app.route('/api/admin/control-center/users/bulk-action', methods=['POST']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required + @control_center_required('admin') def api_bulk_user_action(): """ Perform bulk actions on multiple users (access control, file upload control). @@ -2222,7 +2397,7 @@ def api_bulk_user_action(): else: failed_users.append(user_id) except Exception as e: - current_app.logger.error(f"Error updating user {user_id}: {e}") + debug_print(f"Error updating user {user_id}: {e}") failed_users.append(user_id) # Log admin action @@ -2248,15 +2423,14 @@ def api_bulk_user_action(): return jsonify(result), 200 except Exception as e: - current_app.logger.error(f"Error performing bulk user action: {e}") + debug_print(f"Error performing bulk user action: {e}") return jsonify({'error': 'Failed to perform bulk action'}), 500 # Group Management APIs @app.route('/api/admin/control-center/groups', methods=['GET']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required + @control_center_required('admin') def api_get_all_groups(): """ Get all groups with their activity data and metrics. 
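A minimal client-side sketch of how the new approval-based endpoint `api_delete_user_documents_admin` above is expected to be driven. The base URL, the session cookie value, and the `<user_id>` path segment (inferred from the view signature, since the placeholder is not visible in this diff) are assumptions for illustration; the `reason` body field and the `success`/`message`/`approval_id` response fields come from the handler itself.

```python
import requests

BASE_URL = "https://localhost:5000"  # assumed deployment URL
session = requests.Session()
session.cookies.set("session", "<admin-session-cookie>")  # assumed authenticated admin session

def request_user_document_deletion(user_id: str, reason: str) -> str:
    """Create the deletion approval request and return its id (sketch, not part of the PR)."""
    resp = session.post(
        f"{BASE_URL}/api/admin/control-center/users/{user_id}/delete-documents",
        json={"reason": reason},
    )
    resp.raise_for_status()
    body = resp.json()
    # On HTTP 200 the handler reports that the request is awaiting approval from another admin.
    return body["approval_id"]

approval_id = request_user_document_deletion(
    "00000000-0000-0000-0000-000000000000",
    "User offboarded; remove all workspace documents",
)
print(f"Approval request created: {approval_id}")
```

Because the handler only records an approval request (via `create_approval_request` with `TYPE_DELETE_USER_DOCUMENTS`), no documents are touched until a second admin approves the request.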
@@ -2357,60 +2531,112 @@ def api_get_all_groups(): }), 200 except Exception as e: - current_app.logger.error(f"Error getting groups: {e}") + debug_print(f"Error getting groups: {e}") return jsonify({'error': 'Failed to retrieve groups'}), 500 @app.route('/api/admin/control-center/groups//status', methods=['PUT']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required + @control_center_required('admin') def api_update_group_status(group_id): """ - Update group status (active, locked, inactive, etc.) + Update group status (active, locked, upload_disabled, inactive) + Tracks who made the change and when, logs to activity_logs """ try: data = request.get_json() if not data: return jsonify({'error': 'No data provided'}), 400 - status = data.get('status') - if not status: + new_status = data.get('status') + reason = data.get('reason') # Optional reason for the status change + + if not new_status: return jsonify({'error': 'Status is required'}), 400 + + # Validate status values + valid_statuses = ['active', 'locked', 'upload_disabled', 'inactive'] + if new_status not in valid_statuses: + return jsonify({'error': f'Invalid status. Must be one of: {", ".join(valid_statuses)}'}), 400 # Get the group try: group = cosmos_groups_container.read_item(item=group_id, partition_key=group_id) except: return jsonify({'error': 'Group not found'}), 404 - - # Update group status (you may need to implement your own status logic) - group['status'] = status - group['modifiedDate'] = datetime.utcnow().isoformat() - - # Update in database - cosmos_groups_container.upsert_item(group) - # Log admin action + # Get admin user info admin_user = session.get('user', {}) - log_event("[ControlCenter] Group Status Update", { - "admin_user": admin_user.get('preferred_username', 'unknown'), - "group_id": group_id, - "group_name": group.get('name'), - "new_status": status - }) + admin_user_id = admin_user.get('oid', 'unknown') + admin_email = admin_user.get('preferred_username', 'unknown') + + # Get old status for logging + old_status = group.get('status', 'active') # Default to 'active' if not set - return jsonify({'message': 'Group status updated successfully'}), 200 + # Only update and log if status actually changed + if old_status != new_status: + # Update group status + group['status'] = new_status + group['modifiedDate'] = datetime.utcnow().isoformat() + + # Add status change metadata + if 'statusHistory' not in group: + group['statusHistory'] = [] + + group['statusHistory'].append({ + 'old_status': old_status, + 'new_status': new_status, + 'changed_by_user_id': admin_user_id, + 'changed_by_email': admin_email, + 'changed_at': datetime.utcnow().isoformat(), + 'reason': reason + }) + + # Update in database + cosmos_groups_container.upsert_item(group) + + # Log to activity_logs container for audit trail + from functions_activity_logging import log_group_status_change + log_group_status_change( + group_id=group_id, + group_name=group.get('name', 'Unknown'), + old_status=old_status, + new_status=new_status, + changed_by_user_id=admin_user_id, + changed_by_email=admin_email, + reason=reason + ) + + # Log admin action (legacy logging) + log_event("[ControlCenter] Group Status Update", { + "admin_user": admin_email, + "admin_user_id": admin_user_id, + "group_id": group_id, + "group_name": group.get('name'), + "old_status": old_status, + "new_status": new_status, + "reason": reason + }) + + return jsonify({ + 'message': 'Group status updated successfully', + 'old_status': 
old_status, + 'new_status': new_status + }), 200 + else: + return jsonify({ + 'message': 'Group status unchanged', + 'status': new_status + }), 200 except Exception as e: - current_app.logger.error(f"Error updating group status: {e}") + debug_print(f"Error updating group status: {e}") return jsonify({'error': 'Failed to update group status'}), 500 @app.route('/api/admin/control-center/groups/', methods=['GET']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required + @control_center_required('admin') def api_get_group_details_admin(group_id): """ Get detailed information about a specific group @@ -2428,1295 +2654,4328 @@ def api_get_group_details_admin(group_id): return jsonify(enhanced_group), 200 except Exception as e: - current_app.logger.error(f"Error getting group details: {e}") + debug_print(f"Error getting group details: {e}") return jsonify({'error': 'Failed to retrieve group details'}), 500 @app.route('/api/admin/control-center/groups/', methods=['DELETE']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required + @control_center_required('admin') def api_delete_group_admin(group_id): """ - Delete a group and optionally its documents + Create an approval request to delete a group and all its documents. + Requires approval from group owner or another admin. + + Body: + reason (str): Explanation for deleting the group (required) """ try: data = request.get_json() or {} - delete_documents = data.get('delete_documents', True) # Default to True for safety + reason = data.get('reason', '').strip() + + if not reason: + return jsonify({'error': 'Reason is required for group deletion'}), 400 + + admin_user = session.get('user', {}) + admin_user_id = admin_user.get('oid') or admin_user.get('sub') + admin_email = admin_user.get('preferred_username', admin_user.get('email', 'unknown')) + admin_display_name = admin_user.get('name', admin_email) - # Get the group first + # Validate group exists try: group = cosmos_groups_container.read_item(item=group_id, partition_key=group_id) except: return jsonify({'error': 'Group not found'}), 404 - - # Initialize docs list - docs_to_delete = [] - - # If requested, delete all group documents - if delete_documents: - # Delete from group_documents container - docs_query = "SELECT c.id FROM c WHERE c.group_id = @group_id" - docs_params = [{"name": "@group_id", "value": group_id}] - - docs_to_delete = list(cosmos_group_documents_container.query_items( - query=docs_query, - parameters=docs_params, - enable_cross_partition_query=True - )) - - for doc in docs_to_delete: - try: - cosmos_group_documents_container.delete_item( - item=doc['id'], - partition_key=doc['id'] - ) - except Exception as doc_e: - current_app.logger.warning(f"Failed to delete document {doc['id']}: {doc_e}") - - # Delete files from Azure Storage - try: - storage_client = CLIENTS.get("storage_account_office_docs_client") - if storage_client: - container_client = storage_client.get_container_client(storage_account_user_documents_container_name) - group_folder_prefix = f"group-documents/{group_id}/" - - blob_list = container_client.list_blobs(name_starts_with=group_folder_prefix) - for blob in blob_list: - try: - container_client.delete_blob(blob.name) - except Exception as blob_e: - current_app.logger.warning(f"Failed to delete blob {blob.name}: {blob_e}") - except Exception as storage_e: - current_app.logger.warning(f"Error deleting storage files for group {group_id}: {storage_e}") - 
# Delete the group - cosmos_groups_container.delete_item(item=group_id, partition_key=group_id) + # Create approval request + approval = create_approval_request( + request_type=TYPE_DELETE_GROUP, + group_id=group_id, + requester_id=admin_user_id, + requester_email=admin_email, + requester_name=admin_display_name, + reason=reason, + metadata={ + 'group_name': group.get('name'), + 'owner_id': group.get('owner', {}).get('id'), + 'owner_email': group.get('owner', {}).get('email') + } + ) - # Log admin action - admin_user = session.get('user', {}) - log_event("[ControlCenter] Group Deletion", { - "admin_user": admin_user.get('preferred_username', 'unknown'), + # Log event + log_event("[ControlCenter] Delete Group Request Created", { + "admin_user": admin_email, "group_id": group_id, "group_name": group.get('name'), - "deleted_documents": delete_documents, - "document_count": len(docs_to_delete) + "approval_id": approval['id'], + "reason": reason }) - return jsonify({'message': 'Group deleted successfully'}), 200 + return jsonify({ + 'success': True, + 'message': 'Group deletion request created and pending approval', + 'approval_id': approval['id'], + 'status': 'pending' + }), 200 except Exception as e: - current_app.logger.error(f"Error deleting group: {e}") - return jsonify({'error': 'Failed to delete group'}), 500 + debug_print(f"Error creating group deletion request: {e}") + return jsonify({'error': str(e)}), 500 - # Public Workspaces API - @app.route('/api/admin/control-center/public-workspaces', methods=['GET']) + @app.route('/api/admin/control-center/groups//delete-documents', methods=['POST']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required - def api_control_center_public_workspaces(): + @control_center_required('admin') + def api_delete_group_documents_admin(group_id): """ - Get paginated list of public workspaces with activity data for control center management. - Similar to groups endpoint but for public workspaces. + Create an approval request to delete all documents in a group. + Requires approval from group owner or another admin. 
+ + Body: + reason (str): Explanation for deleting documents (required) """ try: - # Parse request parameters - page = int(request.args.get('page', 1)) - per_page = min(int(request.args.get('per_page', 50)), 100) # Max 100 per page - search_term = request.args.get('search', '').strip() - status_filter = request.args.get('status_filter', 'all') - force_refresh = request.args.get('force_refresh', 'false').lower() == 'true' - export_all = request.args.get('all', 'false').lower() == 'true' # For CSV export - - # Calculate offset (only needed if not exporting all) - offset = (page - 1) * per_page if not export_all else 0 - - # Base query for public workspaces - if search_term: - # Search in workspace name and description - query = """ - SELECT * FROM c - WHERE CONTAINS(LOWER(c.name), @search_term) - OR CONTAINS(LOWER(c.description), @search_term) - ORDER BY c.name - """ - parameters = [{"name": "@search_term", "value": search_term.lower()}] - else: - # Get all workspaces - query = "SELECT * FROM c ORDER BY c.name" - parameters = [] + data = request.get_json() or {} + reason = data.get('reason', '').strip() - # Execute query to get all matching workspaces - all_workspaces = list(cosmos_public_workspaces_container.query_items( - query=query, - parameters=parameters, - enable_cross_partition_query=True - )) + if not reason: + return jsonify({'error': 'Reason is required for document deletion'}), 400 - # Apply status filter if specified - if status_filter != 'all': - # For now, we'll treat all workspaces as 'active' - # This can be enhanced later with actual status logic - if status_filter != 'active': - all_workspaces = [] + admin_user = session.get('user', {}) + admin_user_id = admin_user.get('oid') or admin_user.get('sub') + admin_email = admin_user.get('preferred_username', admin_user.get('email', 'unknown')) + admin_display_name = admin_user.get('name', admin_email) - # Calculate pagination - total_count = len(all_workspaces) - total_pages = math.ceil(total_count / per_page) if per_page > 0 else 0 + # Validate group exists + try: + group = cosmos_groups_container.read_item(item=group_id, partition_key=group_id) + except: + return jsonify({'error': 'Group not found'}), 404 - # Get the workspaces for current page or all for export - if export_all: - workspaces_page = all_workspaces # Get all workspaces for CSV export - else: - workspaces_page = all_workspaces[offset:offset + per_page] + # Create approval request + approval = create_approval_request( + request_type=TYPE_DELETE_DOCUMENTS, + group_id=group_id, + requester_id=admin_user_id, + requester_email=admin_email, + requester_name=admin_display_name, + reason=reason, + metadata={ + 'group_name': group.get('name') + } + ) - # Enhance each workspace with activity data - enhanced_workspaces = [] - for workspace in workspaces_page: - try: - enhanced_workspace = enhance_public_workspace_with_activity(workspace, force_refresh=force_refresh) - enhanced_workspaces.append(enhanced_workspace) - except Exception as enhance_e: - current_app.logger.error(f"Error enhancing workspace {workspace.get('id', 'unknown')}: {enhance_e}") - # Include the original workspace if enhancement fails - enhanced_workspaces.append(workspace) + # Log event + log_event("[ControlCenter] Delete Documents Request Created", { + "admin_user": admin_email, + "group_id": group_id, + "group_name": group.get('name'), + "approval_id": approval['id'], + "reason": reason + }) - # Return response (paginated or all for export) - if export_all: - return jsonify({ - 'success': True, - 
'workspaces': enhanced_workspaces, - 'total_count': total_count, - 'filters': { - 'search': search_term, - 'status_filter': status_filter, - 'force_refresh': force_refresh - } - }) - else: - return jsonify({ - 'workspaces': enhanced_workspaces, - 'pagination': { - 'page': page, - 'per_page': per_page, - 'total_count': total_count, - 'total_pages': total_pages, - 'has_next': page < total_pages, - 'has_prev': page > 1 - }, - 'filters': { - 'search': search_term, - 'status_filter': status_filter, - 'force_refresh': force_refresh - } - }) + return jsonify({ + 'success': True, + 'message': 'Document deletion request created and pending approval', + 'approval_id': approval['id'], + 'status': 'pending' + }), 200 except Exception as e: - current_app.logger.error(f"Error getting public workspaces for control center: {e}") - return jsonify({'error': 'Failed to retrieve public workspaces'}), 500 + debug_print(f"Error creating document deletion request: {e}") + return jsonify({'error': str(e)}), 500 - # Activity Trends API - @app.route('/api/admin/control-center/activity-trends', methods=['GET']) + @app.route('/api/admin/control-center/groups//members', methods=['GET']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required - def api_get_activity_trends(): + @control_center_required('admin') + def api_get_group_members_admin(group_id): """ - Get activity trends data for the control center dashboard. - Returns aggregated activity data from various containers. + Get list of group members for ownership transfer selection """ try: - # Check if custom start_date and end_date are provided - custom_start = request.args.get('start_date') - custom_end = request.args.get('end_date') + # Get the group + try: + group = cosmos_groups_container.read_item(item=group_id, partition_key=group_id) + except: + return jsonify({'error': 'Group not found'}), 404 - if custom_start and custom_end: - # Use custom date range - try: - start_date = datetime.fromisoformat(custom_start).replace(hour=0, minute=0, second=0, microsecond=0) - end_date = datetime.fromisoformat(custom_end).replace(hour=23, minute=59, second=59, microsecond=999999) - days = (end_date - start_date).days + 1 - debug_print(f"🔍 [Activity Trends API] Custom date range: {start_date} to {end_date} ({days} days)") - except ValueError: - return jsonify({'error': 'Invalid date format. 
Use YYYY-MM-DD format.'}), 400 - else: - # Use days parameter (default behavior) - days = int(request.args.get('days', 7)) - # Set end_date to end of current day to include all of today's records - end_date = datetime.now().replace(hour=23, minute=59, second=59, microsecond=999999) - start_date = (end_date - timedelta(days=days)).replace(hour=0, minute=0, second=0, microsecond=0) - debug_print(f"🔍 [Activity Trends API] Request for {days} days: {start_date} to {end_date}") + # Get member list with user details + members = [] + for member in group.get('users', []): + # Skip the current owner from the list + if member.get('userId') == group.get('owner', {}).get('id'): + continue + + members.append({ + 'userId': member.get('userId'), + 'email': member.get('email', 'No email'), + 'displayName': member.get('displayName', 'Unknown User') + }) - # Get activity data - activity_data = get_activity_trends_data(start_date, end_date) + return jsonify({'members': members}), 200 - debug_print(f"🔍 [Activity Trends API] Returning data: {activity_data}") + except Exception as e: + debug_print(f"Error getting group members: {e}") + return jsonify({'error': 'Failed to retrieve group members'}), 500 + + @app.route('/api/admin/control-center/groups//take-ownership', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_admin_take_group_ownership(group_id): + """ + Create an approval request for admin to take ownership of a group. + Requires approval from group owner or another admin. + + Body: + reason (str): Explanation for taking ownership (required) + """ + try: + admin_user = session.get('user', {}) + admin_user_id = admin_user.get('oid') or admin_user.get('sub') + admin_email = admin_user.get('preferred_username', admin_user.get('email', 'unknown')) + admin_display_name = admin_user.get('name', admin_email) + + if not admin_user_id: + return jsonify({'error': 'Could not identify admin user'}), 400 + + # Get request body + data = request.get_json() or {} + reason = data.get('reason', '').strip() + + if not reason: + return jsonify({'error': 'Reason is required for ownership transfer'}), 400 + + # Validate group exists + try: + group = cosmos_groups_container.read_item(item=group_id, partition_key=group_id) + except: + return jsonify({'error': 'Group not found'}), 404 + + # Create approval request + approval = create_approval_request( + request_type=TYPE_TAKE_OWNERSHIP, + group_id=group_id, + requester_id=admin_user_id, + requester_email=admin_email, + requester_name=admin_display_name, + reason=reason, + metadata={ + 'old_owner_id': group.get('owner', {}).get('id'), + 'old_owner_email': group.get('owner', {}).get('email') + } + ) + + # Log event + log_event("[ControlCenter] Take Ownership Request Created", { + "admin_user": admin_email, + "group_id": group_id, + "group_name": group.get('name'), + "approval_id": approval['id'], + "reason": reason + }) return jsonify({ 'success': True, - 'activity_data': activity_data, - 'period': f"{days} days", - 'start_date': start_date.isoformat(), - 'end_date': end_date.isoformat() - }) + 'message': 'Ownership transfer request created and pending approval', + 'approval_id': approval['id'], + 'status': 'pending' + }), 200 except Exception as e: - current_app.logger.error(f"Error getting activity trends: {e}") - print(f"❌ [Activity Trends API] Error: {e}") - return jsonify({'error': 'Failed to retrieve activity trends'}), 500 - - + debug_print(f"Error creating take ownership request: {e}") + return 
jsonify({'error': str(e)}), 500 - @app.route('/api/admin/control-center/activity-trends/export', methods=['POST']) + @app.route('/api/admin/control-center/groups//transfer-ownership', methods=['POST']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required - def api_export_activity_trends(): + @control_center_required('admin') + def api_admin_transfer_group_ownership(group_id): """ - Export activity trends raw data as CSV file based on selected charts and date range. - Returns detailed records with user information instead of aggregated counts. + Create an approval request to transfer group ownership to another member. + Requires approval from group owner or another admin. + + Body: + newOwnerId (str): User ID of the new owner (required) + reason (str): Explanation for ownership transfer (required) """ try: - debug_print("🔍 [ACTIVITY TRENDS DEBUG] Starting CSV export process") data = request.get_json() - debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Request data: {data}") # Parse request parameters - charts = data.get('charts', ['logins', 'chats', 'documents']) # Default to all charts - time_window = data.get('time_window', '30') # Default to 30 days - start_date = data.get('start_date') # For custom range - end_date = data.get('end_date') # For custom range - debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Parsed params - charts: {charts}, time_window: {time_window}, start_date: {start_date}, end_date: {end_date}") # Determine date range - debug_print("🔍 [ACTIVITY TRENDS DEBUG] Determining date range") - if time_window == 'custom' and start_date and end_date: - try: - debug_print("🔍 [ACTIVITY TRENDS DEBUG] Processing custom dates: {start_date} to {end_date}") - start_date_obj = datetime.fromisoformat(start_date.replace('Z', '+00:00') if 'Z' in start_date else start_date) - end_date_obj = datetime.fromisoformat(end_date.replace('Z', '+00:00') if 'Z' in end_date else end_date) - end_date_obj = end_date_obj.replace(hour=23, minute=59, second=59, microsecond=999999) - debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Custom date objects created: {start_date_obj} to {end_date_obj}") - except ValueError as ve: - print(f"❌ [ACTIVITY TRENDS DEBUG] Date parsing error: {ve}") - return jsonify({'error': 'Invalid date format'}), 400 - else: - # Use predefined ranges - days = int(time_window) if time_window.isdigit() else 30 - end_date_obj = datetime.now().replace(hour=23, minute=59, second=59, microsecond=999999) - start_date_obj = end_date_obj - timedelta(days=days-1) - debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Predefined range: {days} days, from {start_date_obj} to {end_date_obj}") + new_owner_user_id = data.get('newOwnerId') + reason = data.get('reason', '').strip() - # Get raw activity data using new function - debug_print("🔍 [ACTIVITY TRENDS DEBUG] Calling get_raw_activity_trends_data") - raw_data = get_raw_activity_trends_data( - start_date_obj, - end_date_obj, - charts - ) - debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Raw data retrieved: {len(raw_data) if raw_data else 0} chart types") + if not new_owner_user_id: + return jsonify({'error': 'Missing newOwnerId'}), 400 - # Generate CSV content with all data types - import io - import csv - output = io.StringIO() - writer = csv.writer(output) + if not reason: + return jsonify({'error': 'Reason is required for ownership transfer'}), 400 - # Write data for each chart type - debug_print(f"🔍 [CSV DEBUG] Processing {len(charts)} chart types: {charts}") - for chart_type in charts: - debug_print(f"🔍 [CSV DEBUG] Processing 
chart type: {chart_type}") - if chart_type in raw_data and raw_data[chart_type]: - debug_print(f"🔍 [CSV DEBUG] Found {len(raw_data[chart_type])} records for {chart_type}") - # Add section header - writer.writerow([]) # Empty row for separation - section_header = f"=== {chart_type.upper()} DATA ===" - debug_print(f"🔍 [CSV DEBUG] Writing section header: {section_header}") - writer.writerow([section_header]) - - # Write headers and data based on chart type - if chart_type == 'logins': - debug_print(f"🔍 [CSV DEBUG] Writing login headers for {chart_type}") - writer.writerow(['Display Name', 'Email', 'User ID', 'Login Time']) - record_count = 0 - for record in raw_data[chart_type]: - record_count += 1 - if record_count <= 3: # Debug first 3 records - debug_print(f"🔍 [CSV DEBUG] Login record {record_count} structure: {list(record.keys())}") - debug_print(f"🔍 [CSV DEBUG] Login record {record_count} data: {record}") - writer.writerow([ - record.get('display_name', ''), - record.get('email', ''), - record.get('user_id', ''), - record.get('login_time', '') - ]) - debug_print(f"🔍 [CSV DEBUG] Finished writing {record_count} login records") - - elif chart_type in ['documents', 'personal_documents', 'group_documents', 'public_documents']: - # Handle all document types with same structure - debug_print(f"🔍 [CSV DEBUG] Writing document headers for {chart_type}") - writer.writerow([ - 'Display Name', 'Email', 'User ID', 'Document ID', 'Document Filename', - 'Document Title', 'Document Page Count', 'Document Size in AI Search', - 'Document Size in Storage Account', 'Upload Date', 'Document Type' - ]) - record_count = 0 - for record in raw_data[chart_type]: - record_count += 1 - if record_count <= 3: # Log first 3 records for debugging - debug_print(f"🔍 [CSV DEBUG] Writing {chart_type} record {record_count}: {record.get('filename', 'No filename')}") - writer.writerow([ - record.get('display_name', ''), - record.get('email', ''), - record.get('user_id', ''), - record.get('document_id', ''), - record.get('filename', ''), - record.get('title', ''), - record.get('page_count', ''), - record.get('ai_search_size', ''), - record.get('storage_account_size', ''), - record.get('upload_date', ''), - record.get('document_type', chart_type.replace('_documents', '').title()) - ]) - debug_print(f"🔍 [CSV DEBUG] Finished writing {record_count} records for {chart_type}") - - elif chart_type == 'chats': - debug_print(f"🔍 [CSV DEBUG] Writing chat headers for {chart_type}") - writer.writerow([ - 'Display Name', 'Email', 'User ID', 'Chat ID', 'Chat Title', - 'Number of Messages', 'Total Size (characters)', 'Created Date' - ]) - record_count = 0 - for record in raw_data[chart_type]: - record_count += 1 - if record_count <= 3: # Debug first 3 records - debug_print(f"🔍 [CSV DEBUG] Chat record {record_count} structure: {list(record.keys())}") - debug_print(f"🔍 [CSV DEBUG] Chat record {record_count} data: {record}") - writer.writerow([ - record.get('display_name', ''), - record.get('email', ''), - record.get('user_id', ''), - record.get('chat_id', ''), - record.get('chat_title', ''), - record.get('message_count', ''), - record.get('total_size', ''), - record.get('created_date', '') - ]) - debug_print(f"🔍 [CSV DEBUG] Finished writing {record_count} chat records") - - elif chart_type == 'tokens': - debug_print(f"🔍 [CSV DEBUG] Writing token usage headers for {chart_type}") - writer.writerow([ - 'Display Name', 'Email', 'User ID', 'Token Type', 'Model Name', - 'Prompt Tokens', 'Completion Tokens', 'Total Tokens', 'Timestamp' - ]) - 
record_count = 0 - for record in raw_data[chart_type]: - record_count += 1 - if record_count <= 3: # Debug first 3 records - debug_print(f"🔍 [CSV DEBUG] Token record {record_count} structure: {list(record.keys())}") - debug_print(f"🔍 [CSV DEBUG] Token record {record_count} data: {record}") - writer.writerow([ - record.get('display_name', ''), - record.get('email', ''), - record.get('user_id', ''), - record.get('token_type', ''), - record.get('model_name', ''), - record.get('prompt_tokens', ''), - record.get('completion_tokens', ''), - record.get('total_tokens', ''), - record.get('timestamp', '') - ]) - debug_print(f"🔍 [CSV DEBUG] Finished writing {record_count} token usage records") - else: - debug_print(f"🔍 [CSV DEBUG] No data found for {chart_type} - available keys: {list(raw_data.keys()) if raw_data else 'None'}") - - # Add final debug info - debug_print(f"🔍 [CSV DEBUG] Finished processing all chart types. Raw data summary:") - for key, value in raw_data.items(): - if isinstance(value, list): - debug_print(f"🔍 [CSV DEBUG] - {key}: {len(value)} records") - else: - debug_print(f"🔍 [CSV DEBUG] - {key}: {type(value)} - {value}") + admin_user = session.get('user', {}) + admin_user_id = admin_user.get('oid') or admin_user.get('sub') + admin_email = admin_user.get('preferred_username', admin_user.get('email', 'unknown')) + admin_display_name = admin_user.get('name', admin_email) - csv_content = output.getvalue() - debug_print(f"🔍 [CSV DEBUG] Generated CSV content length: {len(csv_content)} characters") - debug_print(f"🔍 [CSV DEBUG] CSV content preview (first 500 chars): {csv_content[:500]}") - output.close() + # Get the group + try: + group = cosmos_groups_container.read_item(item=group_id, partition_key=group_id) + except: + return jsonify({'error': 'Group not found'}), 404 - # Generate filename with timestamp - timestamp = datetime.now().strftime('%Y%m%d_%H%M%S') - filename = f"activity_trends_raw_export_{timestamp}.csv" + # Find the new owner in members list + new_owner_member = None + for member in group.get('users', []): + if member.get('userId') == new_owner_user_id: + new_owner_member = member + break + + if not new_owner_member: + return jsonify({'error': 'Selected user is not a member of this group'}), 400 + + # Create approval request + approval = create_approval_request( + request_type=TYPE_TRANSFER_OWNERSHIP, + group_id=group_id, + requester_id=admin_user_id, + requester_email=admin_email, + requester_name=admin_display_name, + reason=reason, + metadata={ + 'new_owner_id': new_owner_user_id, + 'new_owner_email': new_owner_member.get('email'), + 'new_owner_name': new_owner_member.get('displayName'), + 'old_owner_id': group.get('owner', {}).get('id'), + 'old_owner_email': group.get('owner', {}).get('email') + } + ) - # Return CSV as downloadable response - from flask import make_response - response = make_response(csv_content) - response.headers['Content-Type'] = 'text/csv' - response.headers['Content-Disposition'] = f'attachment; filename="{filename}"' + # Log event + log_event("[ControlCenter] Transfer Ownership Request Created", { + "admin_user": admin_email, + "group_id": group_id, + "group_name": group.get('name'), + "new_owner": new_owner_member.get('email'), + "approval_id": approval['id'], + "reason": reason + }) - return response + return jsonify({ + 'success': True, + 'message': 'Ownership transfer request created and pending approval', + 'approval_id': approval['id'], + 'status': 'pending' + }), 200 except Exception as e: - current_app.logger.error(f"Error exporting 
activity trends: {e}") - return jsonify({'error': 'Failed to export data'}), 500 + debug_print(f"Error creating transfer ownership request: {e}") + return jsonify({'error': str(e)}), 500 - @app.route('/api/admin/control-center/activity-trends/chat', methods=['POST']) + @app.route('/api/admin/control-center/groups//add-member', methods=['POST']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required - def api_chat_activity_trends(): + @control_center_required('admin') + def api_admin_add_group_member(group_id): """ - Create a new chat conversation with activity trends data as CSV message. + Admin adds a member to a group (used by both single add and CSV bulk upload) """ try: data = request.get_json() + user_id = data.get('userId') + # Support both 'name' (from CSV) and 'displayName' (from single add form) + name = data.get('displayName') or data.get('name') + email = data.get('email') + role = data.get('role', 'user').lower() - # Parse request parameters - charts = data.get('charts', ['logins', 'chats', 'documents']) # Default to all charts - time_window = data.get('time_window', '30') # Default to 30 days - start_date = data.get('start_date') # For custom range - end_date = data.get('end_date') # For custom range + if not user_id or not name or not email: + return jsonify({'error': 'Missing required fields: userId, name/displayName, email'}), 400 - # Determine date range - if time_window == 'custom' and start_date and end_date: - try: - start_date_obj = datetime.fromisoformat(start_date.replace('Z', '+00:00') if 'Z' in start_date else start_date) - end_date_obj = datetime.fromisoformat(end_date.replace('Z', '+00:00') if 'Z' in end_date else end_date) - end_date_obj = end_date_obj.replace(hour=23, minute=59, second=59, microsecond=999999) - except ValueError: - return jsonify({'error': 'Invalid date format'}), 400 - else: - # Use predefined ranges - days = int(time_window) if time_window.isdigit() else 30 - end_date_obj = datetime.now().replace(hour=23, minute=59, second=59, microsecond=999999) - start_date_obj = end_date_obj - timedelta(days=days-1) - - # Get activity data using existing function - activity_data = get_activity_trends_data( - start_date_obj.strftime('%Y-%m-%d'), - end_date_obj.strftime('%Y-%m-%d') - ) - - # Prepare CSV data - csv_rows = [] - csv_rows.append(['Date', 'Chart Type', 'Activity Count']) - - # Process each requested chart type - for chart_type in charts: - if chart_type in activity_data: - chart_data = activity_data[chart_type] - # Sort dates for consistent output - sorted_dates = sorted(chart_data.keys()) - - for date_key in sorted_dates: - count = chart_data[date_key] - chart_display_name = { - 'logins': 'Logins', - 'chats': 'Chats', - 'documents': 'Documents', - 'personal_documents': 'Personal Documents', - 'group_documents': 'Group Documents', - 'public_documents': 'Public Documents' - }.get(chart_type, chart_type.title()) - - csv_rows.append([date_key, chart_display_name, count]) + # Validate role + valid_roles = ['admin', 'document_manager', 'user'] + if role not in valid_roles: + return jsonify({'error': f'Invalid role. 
Must be: {", ".join(valid_roles)}'}), 400 - # Generate CSV content - import io - import csv - output = io.StringIO() - writer = csv.writer(output) - writer.writerows(csv_rows) - csv_content = output.getvalue() - output.close() + admin_user = session.get('user', {}) + admin_email = admin_user.get('preferred_username', admin_user.get('email', 'unknown')) - # Get current user info - user_id = session.get('user_id') - user_email = session.get('email') - user_display_name = session.get('display_name', user_email) + # Get the group + try: + group = cosmos_groups_container.read_item(item=group_id, partition_key=group_id) + except: + return jsonify({'error': 'Group not found'}), 404 - if not user_id: - return jsonify({'error': 'User not authenticated'}), 401 + # Check if user already exists (skip duplicate) + existing_user = False + for member in group.get('users', []): + if member.get('userId') == user_id: + existing_user = True + break - # Create new conversation - conversation_id = str(uuid.uuid4()) - timestamp = datetime.now(timezone.utc).isoformat() + if existing_user: + return jsonify({ + 'message': f'User {email} already exists in group', + 'skipped': True + }), 200 - # Generate descriptive title with date range - if time_window == 'custom': - date_range = f"{start_date} to {end_date}" - else: - date_range = f"Last {time_window} Days" + # Add user to users array + group.setdefault('users', []).append({ + 'userId': user_id, + 'email': email, + 'displayName': name + }) - charts_text = ", ".join([c.title() for c in charts]) - conversation_title = f"Activity Trends - {charts_text} ({date_range})" + # Add to appropriate role array + if role == 'admin': + if user_id not in group.get('admins', []): + group.setdefault('admins', []).append(user_id) + elif role == 'document_manager': + if user_id not in group.get('documentManagers', []): + group.setdefault('documentManagers', []).append(user_id) - # Create conversation document - conversation_doc = { - "id": conversation_id, - "title": conversation_title, - "user_id": user_id, - "user_email": user_email, - "user_display_name": user_display_name, - "created": timestamp, - "last_updated": timestamp, - "messages": [], - "system_message": "You are analyzing activity trends data from a control center dashboard. The user has provided activity data as a CSV file. Please analyze the data and provide insights about user activity patterns, trends, and any notable observations.", - "message_count": 0, - "settings": { - "model": "gpt-4o", - "temperature": 0.7, - "max_tokens": 4000 - } - } + # Update modification timestamp + group['modifiedDate'] = datetime.utcnow().isoformat() - # Create the initial message with CSV data (simulate file upload) - message_id = str(uuid.uuid4()) - csv_filename = f"activity_trends_{datetime.now().strftime('%Y%m%d_%H%M%S')}.csv" + # Save group + cosmos_groups_container.upsert_item(group) - # Create message with file attachment structure - initial_message = { - "id": message_id, - "role": "user", - "content": f"Please analyze this activity trends data from our system dashboard. 
The data covers {date_range} and includes {charts_text} activity.", - "timestamp": timestamp, - "files": [{ - "name": csv_filename, - "type": "text/csv", - "size": len(csv_content.encode('utf-8')), - "content": csv_content, - "id": str(uuid.uuid4()) - }] + # Determine the action source (single add vs bulk CSV) + source = data.get('source', 'csv') # Default to 'csv' for backward compatibility + action_type = 'add_member_directly' if source == 'single' else 'admin_add_member_csv' + + # Log to activity logs + activity_record = { + 'id': str(uuid.uuid4()), + 'activity_type': action_type, + 'timestamp': datetime.utcnow().isoformat(), + 'admin_user_id': admin_user.get('oid') or admin_user.get('sub'), + 'admin_email': admin_email, + 'group_id': group_id, + 'group_name': group.get('name', 'Unknown'), + 'member_user_id': user_id, + 'member_email': email, + 'member_name': name, + 'member_role': role, + 'source': source, + 'description': f"Admin {admin_email} added member {name} ({email}) to group {group.get('name', group_id)} as {role}" } + cosmos_activity_logs_container.create_item(body=activity_record) - conversation_doc["messages"].append(initial_message) - conversation_doc["message_count"] = 1 - - # Save conversation to database - cosmos_conversations_container.create_item(conversation_doc) - - # Log the activity - log_event("[ControlCenter] Activity Trends Chat Created", { - "conversation_id": conversation_id, - "user_id": user_id, - "charts": charts, - "time_window": time_window, - "date_range": date_range + # Log to Application Insights + log_event("[ControlCenter] Admin Add Group Member", { + "admin_user": admin_email, + "group_id": group_id, + "group_name": group.get('name'), + "member_email": email, + "member_role": role }) return jsonify({ - 'success': True, - 'conversation_id': conversation_id, - 'conversation_title': conversation_title, - 'redirect_url': f'/chat/{conversation_id}' + 'message': f'Member {email} added successfully', + 'skipped': False }), 200 except Exception as e: - current_app.logger.error(f"Error creating activity trends chat: {e}") - return jsonify({'error': 'Failed to create chat conversation'}), 500 - - # Data Refresh API - @app.route('/api/admin/control-center/refresh', methods=['POST']) + debug_print(f"Error adding group member: {e}") + return jsonify({'error': 'Failed to add member'}), 500 + + @app.route('/api/admin/control-center/groups//activity', methods=['GET']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required - def api_refresh_control_center_data(): + @control_center_required('admin') + def api_admin_get_group_activity(group_id): """ - Refresh all Control Center metrics data and update admin timestamp. - This will recalculate all user metrics and cache them in user settings. 
+ Get activity timeline for a specific group from activity logs + Returns document creation/deletion, member changes, status changes, and conversations """ try: - debug_print("🔄 [REFRESH DEBUG] Starting Control Center data refresh...") - current_app.logger.info("Starting Control Center data refresh...") + # Get time range filter (default: last 30 days) + days = request.args.get('days', '30') - # Check if request has specific user_id - from flask import request - try: - request_data = request.get_json(force=True) or {} - except: - # Handle case where no JSON body is sent - request_data = {} - - specific_user_id = request_data.get('user_id') - force_refresh = request_data.get('force_refresh', False) + # Calculate date filter + cutoff_date = None + if days != 'all': + try: + days_int = int(days) + cutoff_date = (datetime.utcnow() - timedelta(days=days_int)).isoformat() + except ValueError: + pass - debug_print(f"🔄 [REFRESH DEBUG] Request data: user_id={specific_user_id}, force_refresh={force_refresh}") + # Build queries - use two separate queries to avoid nested property access issues + # Query 1: Activities with c.group.group_id (member/status changes) + # Query 2: Activities with c.workspace_context.group_id (document operations) - # Get all users to refresh their metrics - debug_print("🔄 [REFRESH DEBUG] Querying all users...") - users_query = "SELECT c.id, c.email, c.display_name, c.lastUpdated, c.settings FROM c" - all_users = list(cosmos_user_settings_container.query_items( - query=users_query, - enable_cross_partition_query=True - )) - debug_print(f"🔄 [REFRESH DEBUG] Found {len(all_users)} users to process") + time_filter = "AND c.timestamp >= @cutoff_date" if cutoff_date else "" - refreshed_count = 0 - failed_count = 0 + # Query 1: Member and status activities (all activity types with c.group.group_id) + # Use SELECT * to get complete raw documents for modal display + query1 = f""" + SELECT * + FROM c + WHERE c.group.group_id = @group_id + {time_filter} + """ - # Refresh metrics for each user - debug_print("🔄 [REFRESH DEBUG] Starting user refresh loop...") - for user in all_users: - try: - user_id = user.get('id') - debug_print(f"🔄 [REFRESH DEBUG] Processing user {user_id}") - - # Force refresh of metrics for this user - enhanced_user = enhance_user_with_activity(user, force_refresh=True) - refreshed_count += 1 - - debug_print(f"✅ [REFRESH DEBUG] Successfully refreshed user {user_id}") - current_app.logger.debug(f"Refreshed metrics for user {user_id}") - except Exception as user_error: - failed_count += 1 - debug_print(f"❌ [REFRESH DEBUG] Failed to refresh user {user.get('id')}: {user_error}") - debug_print(f"❌ [REFRESH DEBUG] User error traceback:") - import traceback - debug_print(traceback.format_exc()) - current_app.logger.error(f"Failed to refresh metrics for user {user.get('id')}: {user_error}") + # Query 2: Document activities (all activity types with c.workspace_context.group_id) + # Use SELECT * to get complete raw documents for modal display + query2 = f""" + SELECT * + FROM c + WHERE c.workspace_context.group_id = @group_id + {time_filter} + """ - debug_print(f"🔄 [REFRESH DEBUG] User refresh loop completed. 
Refreshed: {refreshed_count}, Failed: {failed_count}") + # Log the queries for debugging + debug_print(f"[Group Activity] Querying for group: {group_id}, days: {days}") + debug_print(f"[Group Activity] Query 1: {query1}") + debug_print(f"[Group Activity] Query 2: {query2}") - # Refresh metrics for all groups - debug_print("🔄 [REFRESH DEBUG] Starting group refresh...") - groups_refreshed_count = 0 - groups_failed_count = 0 + parameters = [ + {"name": "@group_id", "value": group_id} + ] + + if cutoff_date: + parameters.append({"name": "@cutoff_date", "value": cutoff_date}) + + debug_print(f"[Group Activity] Parameters: {parameters}") + + # Execute both queries + activities = [] try: - groups_query = "SELECT * FROM c" - all_groups = list(cosmos_groups_container.query_items( - query=groups_query, + # Query 1: Member and status activities + activities1 = list(cosmos_activity_logs_container.query_items( + query=query1, + parameters=parameters, enable_cross_partition_query=True )) - debug_print(f"🔄 [REFRESH DEBUG] Found {len(all_groups)} groups to process") - - # Refresh metrics for each group - for group in all_groups: - try: - group_id = group.get('id') - debug_print(f"🔄 [REFRESH DEBUG] Processing group {group_id}") - - # Force refresh of metrics for this group - enhanced_group = enhance_group_with_activity(group, force_refresh=True) - groups_refreshed_count += 1 - - debug_print(f"✅ [REFRESH DEBUG] Successfully refreshed group {group_id}") - current_app.logger.debug(f"Refreshed metrics for group {group_id}") - except Exception as group_error: - groups_failed_count += 1 - debug_print(f"❌ [REFRESH DEBUG] Failed to refresh group {group.get('id')}: {group_error}") - debug_print(f"❌ [REFRESH DEBUG] Group error traceback:") - import traceback - debug_print(traceback.format_exc()) - current_app.logger.error(f"Failed to refresh metrics for group {group.get('id')}: {group_error}") - - except Exception as groups_error: - debug_print(f"❌ [REFRESH DEBUG] Error querying groups: {groups_error}") - current_app.logger.error(f"Error querying groups for refresh: {groups_error}") - - debug_print(f"🔄 [REFRESH DEBUG] Group refresh loop completed. 
Refreshed: {groups_refreshed_count}, Failed: {groups_failed_count}") + debug_print(f"[Group Activity] Query 1 returned {len(activities1)} activities") + activities.extend(activities1) + except Exception as e: + debug_print(f"[Group Activity] Query 1 failed: {e}") - # Update admin settings with refresh timestamp - debug_print("🔄 [REFRESH DEBUG] Updating admin settings...") try: - from functions_settings import get_settings, update_settings + # Query 2: Document activities + activities2 = list(cosmos_activity_logs_container.query_items( + query=query2, + parameters=parameters, + enable_cross_partition_query=True + )) + debug_print(f"[Group Activity] Query 2 returned {len(activities2)} activities") + activities.extend(activities2) + except Exception as e: + debug_print(f"[Group Activity] Query 2 failed: {e}") + + # Sort combined results by timestamp descending + activities.sort(key=lambda x: x.get('timestamp', ''), reverse=True) + + # Format activities for timeline display + formatted_activities = [] + for activity in activities: + formatted = { + 'id': activity.get('id'), + 'type': activity.get('activity_type'), + 'timestamp': activity.get('timestamp'), + 'user_id': activity.get('user_id'), + 'description': activity.get('description', '') + } + + # Add type-specific details + activity_type = activity.get('activity_type') + + if activity_type == 'document_creation': + doc = activity.get('document', {}) + formatted['document'] = { + 'file_name': doc.get('file_name'), + 'file_type': doc.get('file_type'), + 'file_size_bytes': doc.get('file_size_bytes'), + 'page_count': doc.get('page_count') + } + formatted['icon'] = 'file-earmark-plus' + formatted['color'] = 'success' + + elif activity_type == 'document_deletion': + doc = activity.get('document', {}) + formatted['document'] = { + 'file_name': doc.get('file_name'), + 'file_type': doc.get('file_type') + } + formatted['icon'] = 'file-earmark-minus' + formatted['color'] = 'danger' + + elif activity_type == 'document_metadata_update': + doc = activity.get('document', {}) + formatted['document'] = { + 'file_name': doc.get('file_name') + } + formatted['icon'] = 'pencil-square' + formatted['color'] = 'info' + + elif activity_type == 'group_member_added': + added_by = activity.get('added_by', {}) + added_member = activity.get('added_member', {}) + formatted['member'] = { + 'name': added_member.get('name'), + 'email': added_member.get('email'), + 'role': added_member.get('role') + } + formatted['added_by'] = { + 'email': added_by.get('email'), + 'role': added_by.get('role') + } + formatted['icon'] = 'person-plus' + formatted['color'] = 'primary' + + elif activity_type == 'group_member_deleted': + removed_by = activity.get('removed_by', {}) + removed_member = activity.get('removed_member', {}) + formatted['member'] = { + 'name': removed_member.get('name'), + 'email': removed_member.get('email') + } + formatted['removed_by'] = { + 'email': removed_by.get('email'), + 'role': removed_by.get('role') + } + formatted['icon'] = 'person-dash' + formatted['color'] = 'warning' + + elif activity_type == 'group_status_change': + status_change = activity.get('status_change', {}) + formatted['status_change'] = { + 'from_status': status_change.get('old_status'), # Use old_status from log + 'to_status': status_change.get('new_status') # Use new_status from log + } + formatted['icon'] = 'shield-lock' + formatted['color'] = 'secondary' + + elif activity_type == 'conversation_creation': + formatted['icon'] = 'chat-dots' + formatted['color'] = 'info' + + elif activity_type == 
'token_usage': + usage = activity.get('usage', {}) + formatted['token_usage'] = { + 'total_tokens': usage.get('total_tokens'), + 'prompt_tokens': usage.get('prompt_tokens'), + 'completion_tokens': usage.get('completion_tokens'), + 'model': usage.get('model'), + 'token_type': activity.get('token_type') # 'chat' or 'embedding' + } + # Add chat details if available + chat_details = activity.get('chat_details', {}) + if chat_details: + formatted['token_usage']['conversation_id'] = chat_details.get('conversation_id') + formatted['token_usage']['message_id'] = chat_details.get('message_id') + # Add embedding details if available + embedding_details = activity.get('embedding_details', {}) + if embedding_details: + formatted['token_usage']['document_id'] = embedding_details.get('document_id') + formatted['token_usage']['file_name'] = embedding_details.get('file_name') + formatted['icon'] = 'cpu' + formatted['color'] = 'info' - settings = get_settings() - if settings: - settings['control_center_last_refresh'] = datetime.now(timezone.utc).isoformat() - update_success = update_settings(settings) - - if not update_success: - debug_print("⚠️ [REFRESH DEBUG] Failed to update admin settings") - current_app.logger.warning("Failed to update admin settings with refresh timestamp") - else: - debug_print("✅ [REFRESH DEBUG] Admin settings updated successfully") - current_app.logger.info("Updated admin settings with refresh timestamp") else: - debug_print("⚠️ [REFRESH DEBUG] Could not get admin settings") - - except Exception as admin_error: - debug_print(f"❌ [REFRESH DEBUG] Admin settings update failed: {admin_error}") - current_app.logger.error(f"Error updating admin settings: {admin_error}") - - debug_print(f"🎉 [REFRESH DEBUG] Refresh completed! Users - Refreshed: {refreshed_count}, Failed: {failed_count}. Groups - Refreshed: {groups_refreshed_count}, Failed: {groups_failed_count}") - current_app.logger.info(f"Control Center data refresh completed. Users: {refreshed_count} refreshed, {failed_count} failed. Groups: {groups_refreshed_count} refreshed, {groups_failed_count} failed") + # Fallback for unknown activity types - still show them! 
+ formatted['icon'] = 'circle' + formatted['color'] = 'secondary' + # Keep any additional data that might be in the activity + if activity.get('status_change'): + formatted['status_change'] = activity.get('status_change') + if activity.get('document'): + formatted['document'] = activity.get('document') + if activity.get('group'): + formatted['group'] = activity.get('group') + + formatted_activities.append(formatted) return jsonify({ - 'success': True, - 'message': 'Control Center data refreshed successfully', - 'refreshed_users': refreshed_count, - 'failed_users': failed_count, - 'refreshed_groups': groups_refreshed_count, - 'failed_groups': groups_failed_count, - 'refresh_timestamp': datetime.now(timezone.utc).isoformat() + 'group_id': group_id, + 'activities': formatted_activities, + 'raw_activities': activities, # Include raw activities for modal display + 'count': len(formatted_activities), + 'time_range_days': days }), 200 except Exception as e: - debug_print(f"💥 [REFRESH DEBUG] MAJOR ERROR in refresh endpoint: {e}") - debug_print("💥 [REFRESH DEBUG] Full traceback:") + debug_print(f"Error fetching group activity: {e}") import traceback - debug_print(traceback.format_exc()) - current_app.logger.error(f"Error refreshing Control Center data: {e}") - return jsonify({'error': 'Failed to refresh data'}), 500 - - # Get refresh status API - @app.route('/api/admin/control-center/refresh-status', methods=['GET']) + traceback.print_exc() + return jsonify({'error': f'Failed to fetch group activity: {str(e)}'}), 500 + + # Public Workspaces API + @app.route('/api/admin/control-center/public-workspaces', methods=['GET']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required - def api_get_refresh_status(): + @control_center_required('admin') + def api_control_center_public_workspaces(): """ - Get the last refresh timestamp for Control Center data. + Get paginated list of public workspaces with activity data for control center management. + Similar to groups endpoint but for public workspaces. 
""" try: - from functions_settings import get_settings + # Parse request parameters + page = int(request.args.get('page', 1)) + per_page = min(int(request.args.get('per_page', 50)), 100) # Max 100 per page + search_term = request.args.get('search', '').strip() + status_filter = request.args.get('status_filter', 'all') + force_refresh = request.args.get('force_refresh', 'false').lower() == 'true' + export_all = request.args.get('all', 'false').lower() == 'true' # For CSV export - settings = get_settings() - last_refresh = settings.get('control_center_last_refresh') + # Calculate offset (only needed if not exporting all) + offset = (page - 1) * per_page if not export_all else 0 - return jsonify({ - 'last_refresh': last_refresh, - 'last_refresh_formatted': None if not last_refresh else datetime.fromisoformat(last_refresh.replace('Z', '+00:00') if 'Z' in last_refresh else last_refresh).strftime('%m/%d/%Y %I:%M %p UTC') - }), 200 + # Base query for public workspaces + if search_term: + # Search in workspace name and description + query = """ + SELECT * FROM c + WHERE CONTAINS(LOWER(c.name), @search_term) + OR CONTAINS(LOWER(c.description), @search_term) + ORDER BY c.name + """ + parameters = [{"name": "@search_term", "value": search_term.lower()}] + else: + # Get all workspaces + query = "SELECT * FROM c ORDER BY c.name" + parameters = [] + + # Execute query to get all matching workspaces + all_workspaces = list(cosmos_public_workspaces_container.query_items( + query=query, + parameters=parameters, + enable_cross_partition_query=True + )) + + # Apply status filter if specified + if status_filter != 'all': + # For now, we'll treat all workspaces as 'active' + # This can be enhanced later with actual status logic + if status_filter != 'active': + all_workspaces = [] + + # Calculate pagination + total_count = len(all_workspaces) + total_pages = math.ceil(total_count / per_page) if per_page > 0 else 0 + + # Get the workspaces for current page or all for export + if export_all: + workspaces_page = all_workspaces # Get all workspaces for CSV export + else: + workspaces_page = all_workspaces[offset:offset + per_page] + + # Enhance each workspace with activity data + enhanced_workspaces = [] + for workspace in workspaces_page: + try: + enhanced_workspace = enhance_public_workspace_with_activity(workspace, force_refresh=force_refresh) + enhanced_workspaces.append(enhanced_workspace) + except Exception as enhance_e: + debug_print(f"Error enhancing workspace {workspace.get('id', 'unknown')}: {enhance_e}") + # Include the original workspace if enhancement fails + enhanced_workspaces.append(workspace) + + # Return response (paginated or all for export) + if export_all: + return jsonify({ + 'success': True, + 'workspaces': enhanced_workspaces, + 'total_count': total_count, + 'filters': { + 'search': search_term, + 'status_filter': status_filter, + 'force_refresh': force_refresh + } + }) + else: + return jsonify({ + 'workspaces': enhanced_workspaces, + 'pagination': { + 'page': page, + 'per_page': per_page, + 'total_count': total_count, + 'total_pages': total_pages, + 'has_next': page < total_pages, + 'has_prev': page > 1 + }, + 'filters': { + 'search': search_term, + 'status_filter': status_filter, + 'force_refresh': force_refresh + } + }) except Exception as e: - current_app.logger.error(f"Error getting refresh status: {e}") - return jsonify({'error': 'Failed to get refresh status'}), 500 - - # Activity Log Migration APIs - @app.route('/api/admin/control-center/migrate/status', methods=['GET']) + 
debug_print(f"Error getting public workspaces for control center: {e}") + return jsonify({'error': 'Failed to retrieve public workspaces'}), 500 + + @app.route('/api/admin/control-center/public-workspaces//status', methods=['PUT']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required - def api_get_migration_status(): + @control_center_required('admin') + def api_update_public_workspace_status(workspace_id): """ - Check if there are conversations and documents that need to be migrated to activity logs. - Returns counts of records without the 'added_to_activity_log' flag. + Update public workspace status (active, locked, upload_disabled, inactive) + Tracks who made the change and when, logs to activity_logs """ try: - migration_status = { - 'conversations_without_logs': 0, - 'personal_documents_without_logs': 0, - 'group_documents_without_logs': 0, - 'public_documents_without_logs': 0, - 'total_documents_without_logs': 0, - 'migration_needed': False, - 'estimated_total_records': 0 - } - - # Check conversations without the flag - try: - conversations_query = """ - SELECT VALUE COUNT(1) - FROM c - WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false - """ - conversations_result = list(cosmos_conversations_container.query_items( - query=conversations_query, - enable_cross_partition_query=True - )) - migration_status['conversations_without_logs'] = conversations_result[0] if conversations_result else 0 - except Exception as e: - current_app.logger.warning(f"Error checking conversations migration status: {e}") - - # Check personal documents without the flag - try: - personal_docs_query = """ - SELECT VALUE COUNT(1) - FROM c - WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false - """ - personal_docs_result = list(cosmos_user_documents_container.query_items( - query=personal_docs_query, - enable_cross_partition_query=True - )) - migration_status['personal_documents_without_logs'] = personal_docs_result[0] if personal_docs_result else 0 - except Exception as e: - current_app.logger.warning(f"Error checking personal documents migration status: {e}") + data = request.get_json() + if not data: + return jsonify({'error': 'No data provided'}), 400 + + new_status = data.get('status') + reason = data.get('reason') # Optional reason for the status change - # Check group documents without the flag - try: - group_docs_query = """ - SELECT VALUE COUNT(1) - FROM c - WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false - """ - group_docs_result = list(cosmos_group_documents_container.query_items( - query=group_docs_query, - enable_cross_partition_query=True - )) - migration_status['group_documents_without_logs'] = group_docs_result[0] if group_docs_result else 0 - except Exception as e: - current_app.logger.warning(f"Error checking group documents migration status: {e}") + if not new_status: + return jsonify({'error': 'Status is required'}), 400 - # Check public documents without the flag + # Validate status values + valid_statuses = ['active', 'locked', 'upload_disabled', 'inactive'] + if new_status not in valid_statuses: + return jsonify({'error': f'Invalid status. 
Must be one of: {", ".join(valid_statuses)}'}), 400 + + # Get the workspace try: - public_docs_query = """ - SELECT VALUE COUNT(1) - FROM c - WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false - """ - public_docs_result = list(cosmos_public_documents_container.query_items( - query=public_docs_query, - enable_cross_partition_query=True - )) - migration_status['public_documents_without_logs'] = public_docs_result[0] if public_docs_result else 0 - except Exception as e: - current_app.logger.warning(f"Error checking public documents migration status: {e}") - - # Calculate totals - migration_status['total_documents_without_logs'] = ( - migration_status['personal_documents_without_logs'] + - migration_status['group_documents_without_logs'] + - migration_status['public_documents_without_logs'] - ) - - migration_status['estimated_total_records'] = ( - migration_status['conversations_without_logs'] + - migration_status['total_documents_without_logs'] - ) + workspace = cosmos_public_workspaces_container.read_item(item=workspace_id, partition_key=workspace_id) + except: + return jsonify({'error': 'Public workspace not found'}), 404 - migration_status['migration_needed'] = migration_status['estimated_total_records'] > 0 + # Get admin user info + admin_user = session.get('user', {}) + admin_user_id = admin_user.get('oid', 'unknown') + admin_email = admin_user.get('preferred_username', 'unknown') - return jsonify(migration_status), 200 + # Get old status for logging + old_status = workspace.get('status', 'active') # Default to 'active' if not set + # Only update and log if status actually changed + if old_status != new_status: + # Update workspace status + workspace['status'] = new_status + workspace['modifiedDate'] = datetime.utcnow().isoformat() + + # Add status change metadata + if 'statusHistory' not in workspace: + workspace['statusHistory'] = [] + + workspace['statusHistory'].append({ + 'old_status': old_status, + 'new_status': new_status, + 'changed_by_user_id': admin_user_id, + 'changed_by_email': admin_email, + 'changed_at': datetime.utcnow().isoformat(), + 'reason': reason + }) + + # Update in database + cosmos_public_workspaces_container.upsert_item(workspace) + + # Log to activity_logs container for audit trail + from functions_activity_logging import log_public_workspace_status_change + log_public_workspace_status_change( + workspace_id=workspace_id, + workspace_name=workspace.get('name', 'Unknown'), + old_status=old_status, + new_status=new_status, + changed_by_user_id=admin_user_id, + changed_by_email=admin_email, + reason=reason + ) + + # Log admin action (legacy logging) + log_event("[ControlCenter] Public Workspace Status Update", { + "admin_user": admin_email, + "admin_user_id": admin_user_id, + "workspace_id": workspace_id, + "workspace_name": workspace.get('name'), + "old_status": old_status, + "new_status": new_status, + "reason": reason + }) + + return jsonify({ + 'message': 'Public workspace status updated successfully', + 'old_status': old_status, + 'new_status': new_status + }), 200 + else: + return jsonify({ + 'message': 'Status unchanged', + 'status': new_status + }), 200 + except Exception as e: - current_app.logger.error(f"Error getting migration status: {e}") - return jsonify({'error': 'Failed to get migration status'}), 500 - - @app.route('/api/admin/control-center/migrate/all', methods=['POST']) + debug_print(f"Error updating public workspace status: {e}") + return jsonify({'error': 'Failed to update public workspace status'}), 500 + + 
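As a quick illustration of how the status endpoint above is meant to be consumed, here is a minimal client-side sketch. The base URL, workspace ID, and session cookie are placeholders, and cookie-based auth is an assumption; none of these values come from the PR itself.

```python
# Illustrative only: calling the public-workspace status endpoint added above.
# BASE_URL, WORKSPACE_ID, and the session cookie are placeholder assumptions.
import requests

BASE_URL = "https://contoso-chat.example.com"          # hypothetical deployment URL
WORKSPACE_ID = "11111111-2222-3333-4444-555555555555"  # hypothetical workspace id

resp = requests.put(
    f"{BASE_URL}/api/admin/control-center/public-workspaces/{WORKSPACE_ID}/status",
    json={"status": "locked", "reason": "Pending content review"},
    headers={"Cookie": "session=<admin session cookie>"},  # assumption: session-based auth
    timeout=30,
)
print(resp.status_code, resp.json())  # 200 with old_status/new_status, or an error payload
```

Valid `status` values mirror the server-side list (`active`, `locked`, `upload_disabled`, `inactive`); sending the workspace's current status returns 200 with a "Status unchanged" message and does not write a new `statusHistory` entry.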
@app.route('/api/admin/control-center/public-workspaces/bulk-action', methods=['POST']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required - def api_migrate_to_activity_logs(): + @control_center_required('admin') + def api_bulk_public_workspace_action(): """ - Migrate all conversations and documents without activity logs. - This adds activity log records and sets the 'added_to_activity_log' flag. - - WARNING: This may take a while for large datasets and could impact performance. - Recommended to run during off-peak hours. + Perform bulk actions on multiple public workspaces. + Actions: lock, unlock, disable_uploads, enable_uploads, delete_documents """ try: - from functions_activity_logging import log_conversation_creation, log_document_creation_transaction + data = request.get_json() + if not data: + return jsonify({'error': 'No data provided'}), 400 + + workspace_ids = data.get('workspace_ids', []) + action = data.get('action') + reason = data.get('reason') # Optional reason - results = { - 'conversations_migrated': 0, - 'conversations_failed': 0, - 'personal_documents_migrated': 0, - 'personal_documents_failed': 0, - 'group_documents_migrated': 0, - 'group_documents_failed': 0, - 'public_documents_migrated': 0, - 'public_documents_failed': 0, - 'total_migrated': 0, - 'total_failed': 0, - 'errors': [] + if not workspace_ids or not isinstance(workspace_ids, list): + return jsonify({'error': 'workspace_ids must be a non-empty array'}), 400 + + if not action: + return jsonify({'error': 'Action is required'}), 400 + + # Validate action + valid_actions = ['lock', 'unlock', 'disable_uploads', 'enable_uploads', 'delete_documents'] + if action not in valid_actions: + return jsonify({'error': f'Invalid action. 
Must be one of: {", ".join(valid_actions)}'}), 400 + + # Get admin user info + admin_user = session.get('user', {}) + admin_user_id = admin_user.get('oid', 'unknown') + admin_email = admin_user.get('preferred_username', 'unknown') + + # Map actions to status values + action_to_status = { + 'lock': 'locked', + 'unlock': 'active', + 'disable_uploads': 'upload_disabled', + 'enable_uploads': 'active' } - # Migrate conversations - current_app.logger.info("Starting conversation migration...") - try: - conversations_query = """ - SELECT * - FROM c - WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false - """ - conversations = list(cosmos_conversations_container.query_items( - query=conversations_query, - enable_cross_partition_query=True - )) - - current_app.logger.info(f"Found {len(conversations)} conversations to migrate") - - for conv in conversations: - try: - # Create activity log directly to preserve original timestamp - activity_log = { - 'id': str(uuid.uuid4()), - 'activity_type': 'conversation_creation', - 'user_id': conv.get('user_id'), - 'timestamp': conv.get('created_at') or conv.get('last_updated') or datetime.utcnow().isoformat(), - 'created_at': conv.get('created_at') or conv.get('last_updated') or datetime.utcnow().isoformat(), - 'conversation': { - 'conversation_id': conv.get('id'), - 'title': conv.get('title', 'Untitled'), - 'context': conv.get('context', []), - 'tags': conv.get('tags', []) - }, - 'workspace_type': 'personal', - 'workspace_context': {} - } - - # Save to activity logs container - cosmos_activity_logs_container.upsert_item(activity_log) - - # Add flag to conversation - conv['added_to_activity_log'] = True - cosmos_conversations_container.upsert_item(conv) - - results['conversations_migrated'] += 1 + successful = [] + failed = [] + + for workspace_id in workspace_ids: + try: + # Get the workspace + workspace = cosmos_public_workspaces_container.read_item(item=workspace_id, partition_key=workspace_id) + + if action == 'delete_documents': + # Delete all documents for this workspace + # Query all documents + doc_query = "SELECT c.id FROM c WHERE c.public_workspace_id = @workspace_id" + doc_params = [{"name": "@workspace_id", "value": workspace_id}] - except Exception as conv_error: - results['conversations_failed'] += 1 - error_msg = f"Failed to migrate conversation {conv.get('id')}: {str(conv_error)}" - current_app.logger.error(error_msg) - results['errors'].append(error_msg) + docs_to_delete = list(cosmos_public_documents_container.query_items( + query=doc_query, + parameters=doc_params, + enable_cross_partition_query=True + )) - except Exception as e: - error_msg = f"Error during conversation migration: {str(e)}" - current_app.logger.error(error_msg) - results['errors'].append(error_msg) - - # Migrate personal documents - current_app.logger.info("Starting personal documents migration...") - try: - personal_docs_query = """ - SELECT * - FROM c - WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false - """ - personal_docs = list(cosmos_user_documents_container.query_items( - query=personal_docs_query, - enable_cross_partition_query=True - )) - - for doc in personal_docs: - try: - # Create activity log directly to preserve original timestamp - activity_log = { - 'id': str(uuid.uuid4()), - 'user_id': doc.get('user_id'), - 'activity_type': 'document_creation', - 'workspace_type': 'personal', - 'timestamp': doc.get('upload_date') or datetime.utcnow().isoformat(), - 'created_at': doc.get('upload_date') or 
datetime.utcnow().isoformat(), - 'document': { - 'document_id': doc.get('id'), - 'file_name': doc.get('file_name', 'Unknown'), - 'file_type': doc.get('file_type', 'unknown'), - 'file_size_bytes': doc.get('file_size', 0), - 'page_count': doc.get('number_of_pages', 0), - 'version': doc.get('version', 1) - }, - 'embedding_usage': { - 'total_tokens': doc.get('embedding_tokens', 0), - 'model_deployment_name': doc.get('embedding_model_deployment_name', 'unknown') - }, - 'document_metadata': { - 'author': doc.get('author'), - 'title': doc.get('title'), - 'subject': doc.get('subject'), - 'publication_date': doc.get('publication_date'), - 'keywords': doc.get('keywords', []), - 'abstract': doc.get('abstract') - }, - 'workspace_context': {} - } + deleted_count = 0 + for doc in docs_to_delete: + try: + delete_document_chunks(doc['id']) + delete_document(doc['id']) + deleted_count += 1 + except Exception as del_e: + debug_print(f"Error deleting document {doc['id']}: {del_e}") - # Save to activity logs container - cosmos_activity_logs_container.upsert_item(activity_log) + successful.append({ + 'workspace_id': workspace_id, + 'workspace_name': workspace.get('name', 'Unknown'), + 'action': action, + 'documents_deleted': deleted_count + }) - # Add flag to document - doc['added_to_activity_log'] = True - cosmos_user_documents_container.upsert_item(doc) + # Log the action + log_event("[ControlCenter] Bulk Public Workspace Documents Deleted", { + "admin_user": admin_email, + "admin_user_id": admin_user_id, + "workspace_id": workspace_id, + "workspace_name": workspace.get('name'), + "documents_deleted": deleted_count, + "reason": reason + }) - results['personal_documents_migrated'] += 1 + else: + # Status change action + new_status = action_to_status[action] + old_status = workspace.get('status', 'active') - except Exception as doc_error: - results['personal_documents_failed'] += 1 - error_msg = f"Failed to migrate personal document {doc.get('id')}: {str(doc_error)}" - current_app.logger.error(error_msg) - results['errors'].append(error_msg) + if old_status != new_status: + workspace['status'] = new_status + workspace['modifiedDate'] = datetime.utcnow().isoformat() + + # Add status history + if 'statusHistory' not in workspace: + workspace['statusHistory'] = [] + + workspace['statusHistory'].append({ + 'old_status': old_status, + 'new_status': new_status, + 'changed_by_user_id': admin_user_id, + 'changed_by_email': admin_email, + 'changed_at': datetime.utcnow().isoformat(), + 'reason': reason, + 'bulk_action': True + }) + + cosmos_public_workspaces_container.upsert_item(workspace) + + # Log activity + from functions_activity_logging import log_public_workspace_status_change + log_public_workspace_status_change( + workspace_id=workspace_id, + workspace_name=workspace.get('name', 'Unknown'), + old_status=old_status, + new_status=new_status, + changed_by_user_id=admin_user_id, + changed_by_email=admin_email, + reason=f"Bulk action: {reason}" if reason else "Bulk action" + ) - except Exception as e: - error_msg = f"Error during personal documents migration: {str(e)}" - current_app.logger.error(error_msg) - results['errors'].append(error_msg) + successful.append({ + 'workspace_id': workspace_id, + 'workspace_name': workspace.get('name', 'Unknown'), + 'action': action, + 'old_status': old_status, + 'new_status': new_status + }) + + except Exception as e: + failed.append({ + 'workspace_id': workspace_id, + 'error': str(e) + }) + debug_print(f"Error processing workspace {workspace_id}: {e}") - # Migrate group 
documents - current_app.logger.info("Starting group documents migration...") - try: - group_docs_query = """ - SELECT * - FROM c - WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false - """ - group_docs = list(cosmos_group_documents_container.query_items( - query=group_docs_query, - enable_cross_partition_query=True - )) - - for doc in group_docs: + return jsonify({ + 'message': 'Bulk action completed', + 'successful': successful, + 'failed': failed, + 'summary': { + 'total': len(workspace_ids), + 'success': len(successful), + 'failed': len(failed) + } + }), 200 + + except Exception as e: + debug_print(f"Error performing bulk public workspace action: {e}") + return jsonify({'error': 'Failed to perform bulk action'}), 500 + + @app.route('/api/admin/control-center/public-workspaces/', methods=['GET']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_get_public_workspace_details(workspace_id): + """ + Get detailed information about a specific public workspace. + """ + try: + # Get the workspace + workspace = cosmos_public_workspaces_container.read_item( + item=workspace_id, + partition_key=workspace_id + ) + + # Enhance with activity information + enhanced_workspace = enhance_public_workspace_with_activity(workspace) + + return jsonify(enhanced_workspace), 200 + + except Exception as e: + debug_print(f"Error getting public workspace details: {e}") + return jsonify({'error': 'Failed to retrieve workspace details'}), 500 + + + @app.route('/api/admin/control-center/public-workspaces//members', methods=['GET']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_get_public_workspace_members(workspace_id): + """ + Get all members of a specific public workspace with their roles. + Returns admins, document managers, and owner information. 
+ """ + try: + # Get the workspace + workspace = cosmos_public_workspaces_container.read_item( + item=workspace_id, + partition_key=workspace_id + ) + + # Create members list with roles + members = [] + + # Add owner - owner is an object with userId, email, displayName + owner = workspace.get('owner') + if owner: + members.append({ + 'userId': owner.get('userId', ''), + 'email': owner.get('email', ''), + 'displayName': owner.get('displayName', owner.get('email', 'Unknown')), + 'role': 'owner' + }) + + # Add admins - admins is an array of objects with userId, email, displayName + admins = workspace.get('admins', []) + for admin in admins: + # Handle both object format and string format (for backward compatibility) + if isinstance(admin, dict): + members.append({ + 'userId': admin.get('userId', ''), + 'email': admin.get('email', ''), + 'displayName': admin.get('displayName', admin.get('email', 'Unknown')), + 'role': 'admin' + }) + else: + # Legacy format where admin is just a userId string try: - # Create activity log directly to preserve original timestamp - activity_log = { - 'id': str(uuid.uuid4()), - 'user_id': doc.get('user_id'), - 'activity_type': 'document_creation', - 'workspace_type': 'group', - 'timestamp': doc.get('upload_date') or datetime.utcnow().isoformat(), - 'created_at': doc.get('upload_date') or datetime.utcnow().isoformat(), - 'document': { - 'document_id': doc.get('id'), - 'file_name': doc.get('file_name', 'Unknown'), - 'file_type': doc.get('file_type', 'unknown'), - 'file_size_bytes': doc.get('file_size', 0), - 'page_count': doc.get('number_of_pages', 0), - 'version': doc.get('version', 1) - }, - 'embedding_usage': { - 'total_tokens': doc.get('embedding_tokens', 0), - 'model_deployment_name': doc.get('embedding_model_deployment_name', 'unknown') - }, - 'document_metadata': { - 'author': doc.get('author'), - 'title': doc.get('title'), - 'subject': doc.get('subject'), - 'publication_date': doc.get('publication_date'), - 'keywords': doc.get('keywords', []), - 'abstract': doc.get('abstract') - }, - 'workspace_context': { - 'group_id': doc.get('group_id') - } - } - - # Save to activity logs container - cosmos_activity_logs_container.upsert_item(activity_log) - - # Add flag to document - doc['added_to_activity_log'] = True - cosmos_group_documents_container.upsert_item(doc) - - results['group_documents_migrated'] += 1 - - except Exception as doc_error: - results['group_documents_failed'] += 1 - error_msg = f"Failed to migrate group document {doc.get('id')}: {str(doc_error)}" - current_app.logger.error(error_msg) - results['errors'].append(error_msg) - - except Exception as e: - error_msg = f"Error during group documents migration: {str(e)}" - current_app.logger.error(error_msg) - results['errors'].append(error_msg) + user = cosmos_user_settings_container.read_item( + item=admin, + partition_key=admin + ) + members.append({ + 'userId': admin, + 'email': user.get('email', ''), + 'displayName': user.get('display_name', user.get('email', '')), + 'role': 'admin' + }) + except: + pass - # Migrate public documents - current_app.logger.info("Starting public documents migration...") + # Add document managers - documentManagers is an array of objects with userId, email, displayName + doc_managers = workspace.get('documentManagers', []) + for dm in doc_managers: + # Handle both object format and string format (for backward compatibility) + if isinstance(dm, dict): + members.append({ + 'userId': dm.get('userId', ''), + 'email': dm.get('email', ''), + 'displayName': dm.get('displayName', 
dm.get('email', 'Unknown')), + 'role': 'documentManager' + }) + else: + # Legacy format where documentManager is just a userId string + try: + user = cosmos_user_settings_container.read_item( + item=dm, + partition_key=dm + ) + members.append({ + 'userId': dm, + 'email': user.get('email', ''), + 'displayName': user.get('display_name', user.get('email', '')), + 'role': 'documentManager' + }) + except: + pass + + return jsonify({ + 'success': True, + 'members': members, + 'workspace_name': workspace.get('name', 'Unknown') + }), 200 + + except Exception as e: + debug_print(f"Error getting workspace members: {e}") + return jsonify({'error': 'Failed to retrieve workspace members'}), 500 + + + @app.route('/api/admin/control-center/public-workspaces//add-member', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_admin_add_workspace_member(workspace_id): + """ + Admin adds a member to a public workspace (used by both single add and CSV bulk upload) + """ + try: + data = request.get_json() + user_id = data.get('userId') + name = data.get('displayName') or data.get('name') + email = data.get('email') + role = data.get('role', 'user').lower() + + if not user_id or not name or not email: + return jsonify({'error': 'Missing required fields: userId, name/displayName, email'}), 400 + + # Validate role + valid_roles = ['admin', 'document_manager', 'user'] + if role not in valid_roles: + return jsonify({'error': f'Invalid role. Must be: {", ".join(valid_roles)}'}), 400 + + admin_user = session.get('user', {}) + admin_email = admin_user.get('preferred_username', admin_user.get('email', 'unknown')) + + # Get the workspace try: - public_docs_query = """ - SELECT * - FROM c - WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false - """ - public_docs = list(cosmos_public_documents_container.query_items( - query=public_docs_query, + workspace = cosmos_public_workspaces_container.read_item(item=workspace_id, partition_key=workspace_id) + except: + return jsonify({'error': 'Public workspace not found'}), 404 + + # Check if user already exists + owner = workspace.get('owner', {}) + owner_id = owner.get('userId') if isinstance(owner, dict) else owner + admins = workspace.get('admins', []) + doc_managers = workspace.get('documentManagers', []) + + # Extract user IDs from arrays (handle both object and string formats) + admin_ids = [a.get('userId') if isinstance(a, dict) else a for a in admins] + doc_manager_ids = [dm.get('userId') if isinstance(dm, dict) else dm for dm in doc_managers] + + if user_id == owner_id or user_id in admin_ids or user_id in doc_manager_ids: + return jsonify({ + 'message': f'User {email} already exists in workspace', + 'skipped': True + }), 200 + + # Create full user object + user_obj = { + 'userId': user_id, + 'displayName': name, + 'email': email + } + + # Add to appropriate role array with full user object + if role == 'admin': + workspace.setdefault('admins', []).append(user_obj) + elif role == 'document_manager': + workspace.setdefault('documentManagers', []).append(user_obj) + # Note: 'user' role doesn't have a separate array in public workspaces + # They are implicit members through document access + + # Update modification timestamp + workspace['modifiedDate'] = datetime.utcnow().isoformat() + + # Save workspace + cosmos_public_workspaces_container.upsert_item(workspace) + + # Determine the action source + source = data.get('source', 'csv') + action_type = 
'add_workspace_member_directly' if source == 'single' else 'admin_add_workspace_member_csv' + + # Log to activity logs + activity_record = { + 'id': str(uuid.uuid4()), + 'activity_type': action_type, + 'timestamp': datetime.utcnow().isoformat(), + 'admin_user_id': admin_user.get('oid') or admin_user.get('sub'), + 'admin_email': admin_email, + 'workspace_id': workspace_id, + 'workspace_name': workspace.get('name', 'Unknown'), + 'member_user_id': user_id, + 'member_email': email, + 'member_name': name, + 'member_role': role, + 'source': source, + 'description': f"Admin {admin_email} added member {name} ({email}) to workspace {workspace.get('name', workspace_id)} as {role}", + 'workspace_context': { + 'public_workspace_id': workspace_id + } + } + cosmos_activity_logs_container.create_item(body=activity_record) + + # Log to Application Insights + log_event("[ControlCenter] Admin Add Workspace Member", { + "admin_user": admin_email, + "workspace_id": workspace_id, + "workspace_name": workspace.get('name'), + "member_email": email, + "member_role": role + }) + + return jsonify({ + 'message': f'Member {email} added successfully', + 'skipped': False + }), 200 + + except Exception as e: + debug_print(f"Error adding workspace member: {e}") + return jsonify({'error': 'Failed to add workspace member'}), 500 + + + @app.route('/api/admin/control-center/public-workspaces/<workspace_id>/add-member-single', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_admin_add_workspace_member_single(workspace_id): + """ + Admin adds a single member to a public workspace via the Add Member modal + """ + try: + data = request.get_json() + user_id = data.get('userId') + display_name = data.get('displayName') + email = data.get('email') + role = data.get('role', 'document_manager').lower() + + if not user_id or not display_name or not email: + return jsonify({'error': 'Missing required fields: userId, displayName, email'}), 400 + + # Validate role - workspaces only support admin and document_manager + valid_roles = ['admin', 'document_manager'] + if role not in valid_roles: + return jsonify({'error': f'Invalid role.
Must be: {", ".join(valid_roles)}'}), 400 + + admin_user = session.get('user', {}) + admin_email = admin_user.get('preferred_username', admin_user.get('email', 'unknown')) + + # Get the workspace + try: + workspace = cosmos_public_workspaces_container.read_item(item=workspace_id, partition_key=workspace_id) + except: + return jsonify({'error': 'Public workspace not found'}), 404 + + # Check if user already exists + owner = workspace.get('owner', {}) + owner_id = owner.get('userId') if isinstance(owner, dict) else owner + admins = workspace.get('admins', []) + doc_managers = workspace.get('documentManagers', []) + + # Extract user IDs from arrays (handle both object and string formats) + admin_ids = [a.get('userId') if isinstance(a, dict) else a for a in admins] + doc_manager_ids = [dm.get('userId') if isinstance(dm, dict) else dm for dm in doc_managers] + + if user_id == owner_id or user_id in admin_ids or user_id in doc_manager_ids: + return jsonify({ + 'error': f'User {email} already exists in workspace' + }), 400 + + # Add to appropriate role array with full user info + user_obj = { + 'userId': user_id, + 'displayName': display_name, + 'email': email + } + + if role == 'admin': + workspace.setdefault('admins', []).append(user_obj) + elif role == 'document_manager': + workspace.setdefault('documentManagers', []).append(user_obj) + + # Update modification timestamp + workspace['modifiedDate'] = datetime.utcnow().isoformat() + + # Save workspace + cosmos_public_workspaces_container.upsert_item(workspace) + + # Log to activity logs + activity_record = { + 'id': str(uuid.uuid4()), + 'activity_type': 'add_workspace_member_directly', + 'timestamp': datetime.utcnow().isoformat(), + 'admin_user_id': admin_user.get('oid') or admin_user.get('sub'), + 'admin_email': admin_email, + 'workspace_id': workspace_id, + 'workspace_name': workspace.get('name', 'Unknown'), + 'member_user_id': user_id, + 'member_email': email, + 'member_name': display_name, + 'member_role': role, + 'source': 'single', + 'description': f"Admin {admin_email} added member {display_name} ({email}) to workspace {workspace.get('name', workspace_id)} as {role}", + 'workspace_context': { + 'public_workspace_id': workspace_id + } + } + cosmos_activity_logs_container.create_item(body=activity_record) + + # Log to Application Insights + log_event("[ControlCenter] Admin Add Workspace Member (Single)", { + "admin_user": admin_email, + "workspace_id": workspace_id, + "workspace_name": workspace.get('name'), + "member_email": email, + "member_role": role + }) + + return jsonify({ + 'message': f'Successfully added {display_name} as {role}', + 'success': True + }), 200 + + except Exception as e: + debug_print(f"Error adding workspace member: {e}") + return jsonify({'error': 'Failed to add workspace member'}), 500 + + + @app.route('/api/admin/control-center/public-workspaces//activity', methods=['GET']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_get_public_workspace_activity(workspace_id): + """ + Get activity timeline for a specific public workspace from activity logs + Returns document creation/deletion, member changes, status changes, and conversations + """ + try: + # Get time range filter (default: last 30 days) + days = request.args.get('days', '30') + export = request.args.get('export', 'false').lower() == 'true' + + # Calculate date filter + cutoff_date = None + if days != 'all': + try: + days_int = int(days) + cutoff_date = (datetime.utcnow() - 
timedelta(days=days_int)).isoformat() + except ValueError: + pass + + time_filter = "AND c.timestamp >= @cutoff_date" if cutoff_date else "" + + # Query: All activities for public workspaces (no activity type filter to show everything) + # Use SELECT * to get complete raw documents for modal display + query = f""" + SELECT * + FROM c + WHERE c.workspace_context.public_workspace_id = @workspace_id + {time_filter} + ORDER BY c.timestamp DESC + """ + + # Log the query for debugging + debug_print(f"[Workspace Activity] Querying for workspace: {workspace_id}, days: {days}") + debug_print(f"[Workspace Activity] Query: {query}") + + parameters = [ + {"name": "@workspace_id", "value": workspace_id} + ] + + if cutoff_date: + parameters.append({"name": "@cutoff_date", "value": cutoff_date}) + + debug_print(f"[Workspace Activity] Parameters: {parameters}") + + # Execute query + activities = list(cosmos_activity_logs_container.query_items( + query=query, + parameters=parameters, + enable_cross_partition_query=True + )) + + debug_print(f"[Workspace Activity] Query returned {len(activities)} activities") + + # Format activities for timeline display + formatted_activities = [] + for activity in activities: + formatted = { + 'id': activity.get('id'), + 'type': activity.get('activity_type'), + 'timestamp': activity.get('timestamp'), + 'user_id': activity.get('user_id'), + 'description': activity.get('description', '') + } + + # Add type-specific details + activity_type = activity.get('activity_type') + + if activity_type == 'document_creation': + doc = activity.get('document', {}) + formatted['document'] = { + 'file_name': doc.get('file_name'), + 'file_type': doc.get('file_type'), + 'file_size_bytes': doc.get('file_size_bytes'), + 'page_count': doc.get('page_count') + } + formatted['icon'] = 'file-earmark-plus' + formatted['color'] = 'success' + + elif activity_type == 'document_deletion': + doc = activity.get('document', {}) + formatted['document'] = { + 'file_name': doc.get('file_name'), + 'file_type': doc.get('file_type') + } + formatted['icon'] = 'file-earmark-minus' + formatted['color'] = 'danger' + + elif activity_type == 'document_metadata_update': + doc = activity.get('document', {}) + formatted['document'] = { + 'file_name': doc.get('file_name') + } + formatted['icon'] = 'pencil-square' + formatted['color'] = 'info' + + elif activity_type == 'public_workspace_status_change': + status_change = activity.get('status_change', {}) + formatted['status_change'] = { + 'from_status': status_change.get('old_status'), + 'to_status': status_change.get('new_status'), + 'changed_by': activity.get('changed_by') + } + formatted['icon'] = 'shield-check' + formatted['color'] = 'warning' + + elif activity_type == 'token_usage': + usage = activity.get('usage', {}) + formatted['token_usage'] = { + 'total_tokens': usage.get('total_tokens'), + 'prompt_tokens': usage.get('prompt_tokens'), + 'completion_tokens': usage.get('completion_tokens'), + 'model': usage.get('model'), + 'token_type': activity.get('token_type') # 'chat' or 'embedding' + } + # Add chat details if available + chat_details = activity.get('chat_details', {}) + if chat_details: + formatted['token_usage']['conversation_id'] = chat_details.get('conversation_id') + formatted['token_usage']['message_id'] = chat_details.get('message_id') + # Add embedding details if available + embedding_details = activity.get('embedding_details', {}) + if embedding_details: + formatted['token_usage']['document_id'] = embedding_details.get('document_id') + 
formatted['token_usage']['file_name'] = embedding_details.get('file_name') + formatted['icon'] = 'cpu' + formatted['color'] = 'info' + + else: + # Fallback for unknown activity types - still show them! + formatted['icon'] = 'circle' + formatted['color'] = 'secondary' + # Keep any additional data that might be in the activity + if activity.get('status_change'): + formatted['status_change'] = activity.get('status_change') + if activity.get('document'): + formatted['document'] = activity.get('document') + if activity.get('workspace_context'): + formatted['workspace_context'] = activity.get('workspace_context') + + formatted_activities.append(formatted) + + if export: + # Return CSV for export + import io + import csv + output = io.StringIO() + writer = csv.writer(output) + writer.writerow(['Timestamp', 'Type', 'User ID', 'Description', 'Details']) + for activity in formatted_activities: + details = '' + if activity.get('document'): + doc = activity['document'] + details = f"{doc.get('file_name', '')} - {doc.get('file_type', '')}" + elif activity.get('status_change'): + sc = activity['status_change'] + details = f"{sc.get('from_status', '')} -> {sc.get('to_status', '')}" + + writer.writerow([ + activity['timestamp'], + activity['type'], + activity['user_id'], + activity['description'], + details + ]) + + csv_content = output.getvalue() + output.close() + + from flask import make_response + response = make_response(csv_content) + response.headers['Content-Type'] = 'text/csv' + response.headers['Content-Disposition'] = f'attachment; filename="workspace_{workspace_id}_activity.csv"' + return response + + return jsonify({ + 'success': True, + 'activities': formatted_activities, + 'raw_activities': activities # Include raw activities for modal display + }), 200 + + except Exception as e: + debug_print(f"Error getting workspace activity: {e}") + import traceback + traceback.print_exc() + return jsonify({'error': 'Failed to retrieve workspace activity'}), 500 + + + @app.route('/api/admin/control-center/public-workspaces//take-ownership', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_admin_take_workspace_ownership(workspace_id): + """ + Create an approval request for admin to take ownership of a public workspace. + Requires approval from workspace owner or another admin. 
+ + Body: + reason (str): Explanation for taking ownership (required) + """ + try: + admin_user = session.get('user', {}) + admin_user_id = admin_user.get('oid') or admin_user.get('sub') + admin_email = admin_user.get('preferred_username', admin_user.get('email', 'unknown')) + admin_display_name = admin_user.get('name', admin_email) + + if not admin_user_id: + return jsonify({'error': 'Could not identify admin user'}), 400 + + # Get request body + data = request.get_json() or {} + reason = data.get('reason', '').strip() + + if not reason: + return jsonify({'error': 'Reason is required for ownership transfer'}), 400 + + # Validate workspace exists + try: + workspace = cosmos_public_workspaces_container.read_item(item=workspace_id, partition_key=workspace_id) + except: + return jsonify({'error': 'Workspace not found'}), 404 + + # Get old owner info + old_owner = workspace.get('owner', {}) + if isinstance(old_owner, dict): + old_owner_id = old_owner.get('userId') + old_owner_email = old_owner.get('email') + else: + old_owner_id = old_owner + old_owner_email = 'unknown' + + # Create approval request (use group_id parameter as partition key for workspace) + approval = create_approval_request( + request_type=TYPE_TAKE_OWNERSHIP, + group_id=workspace_id, + requester_id=admin_user_id, + requester_email=admin_email, + requester_name=admin_display_name, + reason=reason, + metadata={ + 'old_owner_id': old_owner_id, + 'old_owner_email': old_owner_email, + 'entity_type': 'workspace' + } + ) + + # Log event + log_event("[ControlCenter] Take Workspace Ownership Request Created", { + "admin_user": admin_email, + "workspace_id": workspace_id, + "workspace_name": workspace.get('name'), + "approval_id": approval['id'], + "reason": reason + }) + + return jsonify({ + 'success': True, + 'message': 'Ownership transfer request created and pending approval', + 'approval_id': approval['id'], + 'requires_approval': True, + 'status': 'pending' + }), 201 + + except Exception as e: + debug_print(f"Error creating take workspace ownership request: {e}") + import traceback + traceback.print_exc() + return jsonify({'error': str(e)}), 500 + + @app.route('/api/admin/control-center/public-workspaces//ownership', methods=['PUT']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_update_public_workspace_ownership(workspace_id): + """ + Create an approval request to transfer public workspace ownership to another member. + Requires approval from workspace owner or another admin. 
+ + Body: + newOwnerId (str): User ID of the new owner (required) + reason (str): Explanation for ownership transfer (required) + """ + try: + data = request.get_json() + new_owner_user_id = data.get('newOwnerId') + reason = data.get('reason', '').strip() + + if not new_owner_user_id: + return jsonify({'error': 'Missing newOwnerId'}), 400 + + if not reason: + return jsonify({'error': 'Reason is required for ownership transfer'}), 400 + + admin_user = session.get('user', {}) + admin_user_id = admin_user.get('oid') or admin_user.get('sub') + admin_email = admin_user.get('preferred_username', admin_user.get('email', 'unknown')) + admin_display_name = admin_user.get('name', admin_email) + + # Get the workspace + try: + workspace = cosmos_public_workspaces_container.read_item(item=workspace_id, partition_key=workspace_id) + except: + return jsonify({'error': 'Workspace not found'}), 404 + + # Get new owner user details + try: + new_owner_user = cosmos_user_settings_container.read_item( + item=new_owner_user_id, + partition_key=new_owner_user_id + ) + new_owner_email = new_owner_user.get('email', 'unknown') + new_owner_name = new_owner_user.get('display_name', new_owner_email) + except: + return jsonify({'error': 'New owner user not found'}), 404 + + # Check if new owner is a member of the workspace + is_member = False + current_owner = workspace.get('owner', {}) + if isinstance(current_owner, dict): + if current_owner.get('userId') == new_owner_user_id: + is_member = True + elif current_owner == new_owner_user_id: + is_member = True + + # Check admins + for admin in workspace.get('admins', []): + admin_id = admin.get('userId') if isinstance(admin, dict) else admin + if admin_id == new_owner_user_id: + is_member = True + break + + # Check documentManagers + if not is_member: + for dm in workspace.get('documentManagers', []): + dm_id = dm.get('userId') if isinstance(dm, dict) else dm + if dm_id == new_owner_user_id: + is_member = True + break + + if not is_member: + return jsonify({'error': 'Selected user is not a member of this workspace'}), 400 + + # Get old owner info + old_owner_id = None + old_owner_email = None + if isinstance(current_owner, dict): + old_owner_id = current_owner.get('userId') + old_owner_email = current_owner.get('email') + else: + old_owner_id = current_owner + + # Create approval request (use group_id parameter as partition key for workspace) + approval = create_approval_request( + request_type=TYPE_TRANSFER_OWNERSHIP, + group_id=workspace_id, + requester_id=admin_user_id, + requester_email=admin_email, + requester_name=admin_display_name, + reason=reason, + metadata={ + 'new_owner_id': new_owner_user_id, + 'new_owner_email': new_owner_email, + 'new_owner_name': new_owner_name, + 'old_owner_id': old_owner_id, + 'old_owner_email': old_owner_email, + 'entity_type': 'workspace' + } + ) + + # Log event + log_event("[ControlCenter] Transfer Workspace Ownership Request Created", { + "admin_user": admin_email, + "workspace_id": workspace_id, + "workspace_name": workspace.get('name'), + "new_owner": new_owner_email, + "old_owner_id": old_owner_id, + "approval_id": approval['id'], + "reason": reason + }) + + return jsonify({ + 'message': 'Ownership transfer approval request created', + 'approval_id': approval['id'], + 'requires_approval': True + }), 201 + + except Exception as e: + debug_print(f"Error creating workspace ownership transfer request: {e}") + import traceback + traceback.print_exc() + return jsonify({'error': 'Failed to create ownership transfer request'}), 500 + + + 
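The ownership endpoints above repeatedly normalize member entries that may be stored either as objects (`{'userId', 'email', 'displayName'}`) or as legacy plain user-ID strings. The helper below is a sketch in that spirit, not part of the PR: it captures the owner/admins/documentManagers membership check the transfer endpoint performs before creating an approval request.

```python
# Illustrative sketch (not in the PR): membership check tolerant of both the
# object form and the legacy string form of workspace member entries.
from typing import Union

def member_user_id(member: Union[dict, str, None]) -> str:
    """Return the userId whether the entry is an object or a legacy plain string."""
    if member is None:
        return ""
    return member.get("userId", "") if isinstance(member, dict) else member

def is_workspace_member(workspace: dict, user_id: str) -> bool:
    """True if user_id is the owner, an admin, or a document manager of the workspace."""
    candidates = [workspace.get("owner")]
    candidates += list(workspace.get("admins", []))
    candidates += list(workspace.get("documentManagers", []))
    return any(member_user_id(m) == user_id for m in candidates)
```

Factoring the check out this way would also keep the normalization logic in `api_update_public_workspace_ownership` and the members endpoint from drifting apart, though the PR keeps it inline.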
@app.route('/api/admin/control-center/public-workspaces//documents', methods=['DELETE']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_delete_public_workspace_documents_admin(workspace_id): + """ + Create an approval request to delete all documents in a public workspace. + Requires approval from workspace owner or another admin. + + Body: + reason (str): Explanation for deleting documents (required) + """ + try: + data = request.get_json() or {} + reason = data.get('reason', '').strip() + + if not reason: + return jsonify({'error': 'Reason is required for document deletion'}), 400 + + admin_user = session.get('user', {}) + admin_user_id = admin_user.get('oid') or admin_user.get('sub') + admin_email = admin_user.get('preferred_username', admin_user.get('email', 'unknown')) + admin_display_name = admin_user.get('name', admin_email) + + # Validate workspace exists + try: + workspace = cosmos_public_workspaces_container.read_item(item=workspace_id, partition_key=workspace_id) + except: + return jsonify({'error': 'Public workspace not found'}), 404 + + # Create approval request + approval = create_approval_request( + request_type=TYPE_DELETE_DOCUMENTS, + group_id=workspace_id, # Use workspace_id as group_id for approval system + requester_id=admin_user_id, + requester_email=admin_email, + requester_name=admin_display_name, + reason=reason, + metadata={ + 'workspace_name': workspace.get('name'), + 'entity_type': 'workspace' + } + ) + + # Log event + log_event("[ControlCenter] Delete Public Workspace Documents Request Created", { + "admin_user": admin_email, + "workspace_id": workspace_id, + "workspace_name": workspace.get('name'), + "approval_id": approval['id'], + "reason": reason + }) + + return jsonify({ + 'success': True, + 'message': 'Document deletion request created and pending approval', + 'approval_id': approval['id'], + 'status': 'pending' + }), 200 + + except Exception as e: + debug_print(f"Error creating document deletion request: {e}") + return jsonify({'error': str(e)}), 500 + + + @app.route('/api/admin/control-center/public-workspaces/', methods=['DELETE']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_delete_public_workspace_admin(workspace_id): + """ + Create an approval request to delete an entire public workspace. + Requires approval from workspace owner or another admin. 
+ + Body: + reason (str): Explanation for deleting the workspace (required) + """ + try: + data = request.get_json() or {} + reason = data.get('reason', '').strip() + + if not reason: + return jsonify({'error': 'Reason is required for workspace deletion'}), 400 + + admin_user = session.get('user', {}) + admin_user_id = admin_user.get('oid') or admin_user.get('sub') + admin_email = admin_user.get('preferred_username', admin_user.get('email', 'unknown')) + admin_display_name = admin_user.get('name', admin_email) + + # Validate workspace exists + try: + workspace = cosmos_public_workspaces_container.read_item( + item=workspace_id, + partition_key=workspace_id + ) + except: + return jsonify({'error': 'Public workspace not found'}), 404 + + # Create approval request + approval = create_approval_request( + request_type=TYPE_DELETE_GROUP, # Reuse TYPE_DELETE_GROUP for workspace deletion + group_id=workspace_id, # Use workspace_id as group_id for approval system + requester_id=admin_user_id, + requester_email=admin_email, + requester_name=admin_display_name, + reason=reason, + metadata={ + 'workspace_name': workspace.get('name'), + 'entity_type': 'workspace' + } + ) + + # Log event + log_event("[ControlCenter] Delete Public Workspace Request Created", { + "admin_user": admin_email, + "workspace_id": workspace_id, + "workspace_name": workspace.get('name'), + "approval_id": approval['id'], + "reason": reason + }) + + return jsonify({ + 'success': True, + 'message': 'Workspace deletion request created and pending approval', + 'approval_id': approval['id'], + 'status': 'pending' + }), 200 + + except Exception as e: + debug_print(f"Error creating workspace deletion request: {e}") + return jsonify({'error': str(e)}), 500 + + # Activity Trends API + @app.route('/api/admin/control-center/activity-trends', methods=['GET']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('dashboard') + def api_get_activity_trends(): + """ + Get activity trends data for the control center dashboard. + Returns aggregated activity data from various containers. + """ + try: + # Check if custom start_date and end_date are provided + custom_start = request.args.get('start_date') + custom_end = request.args.get('end_date') + + if custom_start and custom_end: + # Use custom date range + try: + start_date = datetime.fromisoformat(custom_start).replace(hour=0, minute=0, second=0, microsecond=0) + end_date = datetime.fromisoformat(custom_end).replace(hour=23, minute=59, second=59, microsecond=999999) + days = (end_date - start_date).days + 1 + debug_print(f"🔍 [Activity Trends API] Custom date range: {start_date} to {end_date} ({days} days)") + except ValueError: + return jsonify({'error': 'Invalid date format. 
Use YYYY-MM-DD format.'}), 400 + else: + # Use days parameter (default behavior) + days = int(request.args.get('days', 7)) + # Set end_date to end of current day to include all of today's records + end_date = datetime.now().replace(hour=23, minute=59, second=59, microsecond=999999) + start_date = (end_date - timedelta(days=days)).replace(hour=0, minute=0, second=0, microsecond=0) + debug_print(f"🔍 [Activity Trends API] Request for {days} days: {start_date} to {end_date}") + + # Get activity data + activity_data = get_activity_trends_data(start_date, end_date) + + debug_print(f"🔍 [Activity Trends API] Returning data: {activity_data}") + + return jsonify({ + 'success': True, + 'activity_data': activity_data, + 'period': f"{days} days", + 'start_date': start_date.isoformat(), + 'end_date': end_date.isoformat() + }) + + except Exception as e: + debug_print(f"Error getting activity trends: {e}") + print(f"❌ [Activity Trends API] Error: {e}") + return jsonify({'error': 'Failed to retrieve activity trends'}), 500 + + + + @app.route('/api/admin/control-center/activity-trends/export', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('dashboard') + def api_export_activity_trends(): + """ + Export activity trends raw data as CSV file based on selected charts and date range. + Returns detailed records with user information instead of aggregated counts. + """ + try: + debug_print("🔍 [ACTIVITY TRENDS DEBUG] Starting CSV export process") + data = request.get_json() + debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Request data: {data}") # Parse request parameters + charts = data.get('charts', ['logins', 'chats', 'documents']) # Default to all charts + time_window = data.get('time_window', '30') # Default to 30 days + start_date = data.get('start_date') # For custom range + end_date = data.get('end_date') # For custom range + debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Parsed params - charts: {charts}, time_window: {time_window}, start_date: {start_date}, end_date: {end_date}") # Determine date range + debug_print("🔍 [ACTIVITY TRENDS DEBUG] Determining date range") + if time_window == 'custom' and start_date and end_date: + try: + debug_print("🔍 [ACTIVITY TRENDS DEBUG] Processing custom dates: {start_date} to {end_date}") + start_date_obj = datetime.fromisoformat(start_date.replace('Z', '+00:00') if 'Z' in start_date else start_date) + end_date_obj = datetime.fromisoformat(end_date.replace('Z', '+00:00') if 'Z' in end_date else end_date) + end_date_obj = end_date_obj.replace(hour=23, minute=59, second=59, microsecond=999999) + debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Custom date objects created: {start_date_obj} to {end_date_obj}") + except ValueError as ve: + print(f"❌ [ACTIVITY TRENDS DEBUG] Date parsing error: {ve}") + return jsonify({'error': 'Invalid date format'}), 400 + else: + # Use predefined ranges + days = int(time_window) if time_window.isdigit() else 30 + end_date_obj = datetime.now().replace(hour=23, minute=59, second=59, microsecond=999999) + start_date_obj = end_date_obj - timedelta(days=days-1) + debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Predefined range: {days} days, from {start_date_obj} to {end_date_obj}") + + # Get raw activity data using new function + debug_print("🔍 [ACTIVITY TRENDS DEBUG] Calling get_raw_activity_trends_data") + raw_data = get_raw_activity_trends_data( + start_date_obj, + end_date_obj, + charts + ) + debug_print(f"🔍 [ACTIVITY TRENDS DEBUG] Raw data retrieved: {len(raw_data) if raw_data else 0} chart types") + + # 
Generate CSV content with all data types + import io + import csv + output = io.StringIO() + writer = csv.writer(output) + + # Write data for each chart type + debug_print(f"🔍 [CSV DEBUG] Processing {len(charts)} chart types: {charts}") + for chart_type in charts: + debug_print(f"🔍 [CSV DEBUG] Processing chart type: {chart_type}") + if chart_type in raw_data and raw_data[chart_type]: + debug_print(f"🔍 [CSV DEBUG] Found {len(raw_data[chart_type])} records for {chart_type}") + # Add section header + writer.writerow([]) # Empty row for separation + section_header = f"=== {chart_type.upper()} DATA ===" + debug_print(f"🔍 [CSV DEBUG] Writing section header: {section_header}") + writer.writerow([section_header]) + + # Write headers and data based on chart type + if chart_type == 'logins': + debug_print(f"🔍 [CSV DEBUG] Writing login headers for {chart_type}") + writer.writerow(['Display Name', 'Email', 'User ID', 'Login Time']) + record_count = 0 + for record in raw_data[chart_type]: + record_count += 1 + if record_count <= 3: # Debug first 3 records + debug_print(f"🔍 [CSV DEBUG] Login record {record_count} structure: {list(record.keys())}") + debug_print(f"🔍 [CSV DEBUG] Login record {record_count} data: {record}") + writer.writerow([ + record.get('display_name', ''), + record.get('email', ''), + record.get('user_id', ''), + record.get('login_time', '') + ]) + debug_print(f"🔍 [CSV DEBUG] Finished writing {record_count} login records") + + elif chart_type in ['documents', 'personal_documents', 'group_documents', 'public_documents']: + # Handle all document types with same structure + debug_print(f"🔍 [CSV DEBUG] Writing document headers for {chart_type}") + writer.writerow([ + 'Display Name', 'Email', 'User ID', 'Document ID', 'Document Filename', + 'Document Title', 'Document Page Count', 'Document Size in AI Search', + 'Document Size in Storage Account', 'Upload Date', 'Document Type' + ]) + record_count = 0 + for record in raw_data[chart_type]: + record_count += 1 + if record_count <= 3: # Log first 3 records for debugging + debug_print(f"🔍 [CSV DEBUG] Writing {chart_type} record {record_count}: {record.get('filename', 'No filename')}") + writer.writerow([ + record.get('display_name', ''), + record.get('email', ''), + record.get('user_id', ''), + record.get('document_id', ''), + record.get('filename', ''), + record.get('title', ''), + record.get('page_count', ''), + record.get('ai_search_size', ''), + record.get('storage_account_size', ''), + record.get('upload_date', ''), + record.get('document_type', chart_type.replace('_documents', '').title()) + ]) + debug_print(f"🔍 [CSV DEBUG] Finished writing {record_count} records for {chart_type}") + + elif chart_type == 'chats': + debug_print(f"🔍 [CSV DEBUG] Writing chat headers for {chart_type}") + writer.writerow([ + 'Display Name', 'Email', 'User ID', 'Chat ID', 'Chat Title', + 'Number of Messages', 'Total Size (characters)', 'Created Date' + ]) + record_count = 0 + for record in raw_data[chart_type]: + record_count += 1 + if record_count <= 3: # Debug first 3 records + debug_print(f"🔍 [CSV DEBUG] Chat record {record_count} structure: {list(record.keys())}") + debug_print(f"🔍 [CSV DEBUG] Chat record {record_count} data: {record}") + writer.writerow([ + record.get('display_name', ''), + record.get('email', ''), + record.get('user_id', ''), + record.get('chat_id', ''), + record.get('chat_title', ''), + record.get('message_count', ''), + record.get('total_size', ''), + record.get('created_date', '') + ]) + debug_print(f"🔍 [CSV DEBUG] Finished writing 
{record_count} chat records") + + elif chart_type == 'tokens': + debug_print(f"🔍 [CSV DEBUG] Writing token usage headers for {chart_type}") + writer.writerow([ + 'Display Name', 'Email', 'User ID', 'Token Type', 'Model Name', + 'Prompt Tokens', 'Completion Tokens', 'Total Tokens', 'Timestamp' + ]) + record_count = 0 + for record in raw_data[chart_type]: + record_count += 1 + if record_count <= 3: # Debug first 3 records + debug_print(f"🔍 [CSV DEBUG] Token record {record_count} structure: {list(record.keys())}") + debug_print(f"🔍 [CSV DEBUG] Token record {record_count} data: {record}") + writer.writerow([ + record.get('display_name', ''), + record.get('email', ''), + record.get('user_id', ''), + record.get('token_type', ''), + record.get('model_name', ''), + record.get('prompt_tokens', ''), + record.get('completion_tokens', ''), + record.get('total_tokens', ''), + record.get('timestamp', '') + ]) + debug_print(f"🔍 [CSV DEBUG] Finished writing {record_count} token usage records") + else: + debug_print(f"🔍 [CSV DEBUG] No data found for {chart_type} - available keys: {list(raw_data.keys()) if raw_data else 'None'}") + + # Add final debug info + debug_print(f"🔍 [CSV DEBUG] Finished processing all chart types. Raw data summary:") + for key, value in raw_data.items(): + if isinstance(value, list): + debug_print(f"🔍 [CSV DEBUG] - {key}: {len(value)} records") + else: + debug_print(f"🔍 [CSV DEBUG] - {key}: {type(value)} - {value}") + + csv_content = output.getvalue() + debug_print(f"🔍 [CSV DEBUG] Generated CSV content length: {len(csv_content)} characters") + debug_print(f"🔍 [CSV DEBUG] CSV content preview (first 500 chars): {csv_content[:500]}") + output.close() + + # Generate filename with timestamp + timestamp = datetime.now().strftime('%Y%m%d_%H%M%S') + filename = f"activity_trends_raw_export_{timestamp}.csv" + + # Return CSV as downloadable response + from flask import make_response + response = make_response(csv_content) + response.headers['Content-Type'] = 'text/csv' + response.headers['Content-Disposition'] = f'attachment; filename="{filename}"' + + return response + + except Exception as e: + debug_print(f"Error exporting activity trends: {e}") + return jsonify({'error': 'Failed to export data'}), 500 + + @app.route('/api/admin/control-center/activity-trends/chat', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('dashboard') + def api_chat_activity_trends(): + """ + Create a new chat conversation with activity trends data as CSV message. 
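A hedged client-side sketch of exercising the CSV export endpoint above. It assumes a `requests.Session` that already carries an authenticated cookie (the app itself authenticates through Entra ID) and a placeholder base URL:

```python
import requests

BASE_URL = "https://localhost:5000"  # placeholder; adjust to your deployment
session = requests.Session()         # assumed to already carry an authenticated session cookie

payload = {
    "charts": ["logins", "chats", "documents"],
    "time_window": "custom",
    "start_date": "2024-01-01",
    "end_date": "2024-01-31",
}
resp = session.post(
    f"{BASE_URL}/api/admin/control-center/activity-trends/export",
    json=payload,
    timeout=60,
)
resp.raise_for_status()

# The endpoint returns the CSV body directly, with a Content-Disposition header naming the file.
with open("activity_trends_raw_export.csv", "wb") as fh:
    fh.write(resp.content)
print("Saved", resp.headers.get("Content-Disposition"))
```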
+ """ + try: + data = request.get_json() + + # Parse request parameters + charts = data.get('charts', ['logins', 'chats', 'documents']) # Default to all charts + time_window = data.get('time_window', '30') # Default to 30 days + start_date = data.get('start_date') # For custom range + end_date = data.get('end_date') # For custom range + + # Determine date range + if time_window == 'custom' and start_date and end_date: + try: + start_date_obj = datetime.fromisoformat(start_date.replace('Z', '+00:00') if 'Z' in start_date else start_date) + end_date_obj = datetime.fromisoformat(end_date.replace('Z', '+00:00') if 'Z' in end_date else end_date) + end_date_obj = end_date_obj.replace(hour=23, minute=59, second=59, microsecond=999999) + except ValueError: + return jsonify({'error': 'Invalid date format'}), 400 + else: + # Use predefined ranges + days = int(time_window) if time_window.isdigit() else 30 + end_date_obj = datetime.now().replace(hour=23, minute=59, second=59, microsecond=999999) + start_date_obj = end_date_obj - timedelta(days=days-1) + + # Get activity data using existing function + activity_data = get_activity_trends_data( + start_date_obj.strftime('%Y-%m-%d'), + end_date_obj.strftime('%Y-%m-%d') + ) + + # Prepare CSV data + csv_rows = [] + csv_rows.append(['Date', 'Chart Type', 'Activity Count']) + + # Process each requested chart type + for chart_type in charts: + if chart_type in activity_data: + chart_data = activity_data[chart_type] + # Sort dates for consistent output + sorted_dates = sorted(chart_data.keys()) + + for date_key in sorted_dates: + count = chart_data[date_key] + chart_display_name = { + 'logins': 'Logins', + 'chats': 'Chats', + 'documents': 'Documents', + 'personal_documents': 'Personal Documents', + 'group_documents': 'Group Documents', + 'public_documents': 'Public Documents' + }.get(chart_type, chart_type.title()) + + csv_rows.append([date_key, chart_display_name, count]) + + # Generate CSV content + import io + import csv + output = io.StringIO() + writer = csv.writer(output) + writer.writerows(csv_rows) + csv_content = output.getvalue() + output.close() + + # Get current user info + user_id = session.get('user_id') + user_email = session.get('email') + user_display_name = session.get('display_name', user_email) + + if not user_id: + return jsonify({'error': 'User not authenticated'}), 401 + + # Create new conversation + conversation_id = str(uuid.uuid4()) + timestamp = datetime.now(timezone.utc).isoformat() + + # Generate descriptive title with date range + if time_window == 'custom': + date_range = f"{start_date} to {end_date}" + else: + date_range = f"Last {time_window} Days" + + charts_text = ", ".join([c.title() for c in charts]) + conversation_title = f"Activity Trends - {charts_text} ({date_range})" + + # Create conversation document + conversation_doc = { + "id": conversation_id, + "title": conversation_title, + "user_id": user_id, + "user_email": user_email, + "user_display_name": user_display_name, + "created": timestamp, + "last_updated": timestamp, + "messages": [], + "system_message": "You are analyzing activity trends data from a control center dashboard. The user has provided activity data as a CSV file. 
Please analyze the data and provide insights about user activity patterns, trends, and any notable observations.", + "message_count": 0, + "settings": { + "model": "gpt-4o", + "temperature": 0.7, + "max_tokens": 4000 + } + } + + # Create the initial message with CSV data (simulate file upload) + message_id = str(uuid.uuid4()) + csv_filename = f"activity_trends_{datetime.now().strftime('%Y%m%d_%H%M%S')}.csv" + + # Create message with file attachment structure + initial_message = { + "id": message_id, + "role": "user", + "content": f"Please analyze this activity trends data from our system dashboard. The data covers {date_range} and includes {charts_text} activity.", + "timestamp": timestamp, + "files": [{ + "name": csv_filename, + "type": "text/csv", + "size": len(csv_content.encode('utf-8')), + "content": csv_content, + "id": str(uuid.uuid4()) + }] + } + + conversation_doc["messages"].append(initial_message) + conversation_doc["message_count"] = 1 + + # Save conversation to database + cosmos_conversations_container.create_item(conversation_doc) + + # Log the activity + log_event("[ControlCenter] Activity Trends Chat Created", { + "conversation_id": conversation_id, + "user_id": user_id, + "charts": charts, + "time_window": time_window, + "date_range": date_range + }) + + return jsonify({ + 'success': True, + 'conversation_id': conversation_id, + 'conversation_title': conversation_title, + 'redirect_url': f'/chat/{conversation_id}' + }), 200 + + except Exception as e: + debug_print(f"Error creating activity trends chat: {e}") + return jsonify({'error': 'Failed to create chat conversation'}), 500 + + # Data Refresh API + @app.route('/api/admin/control-center/refresh', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_refresh_control_center_data(): + """ + Refresh all Control Center metrics data and update admin timestamp. + This will recalculate all user metrics and cache them in user settings. 
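For reference, a minimal sketch of the aggregated CSV shape the chat endpoint builds (`Date, Chart Type, Activity Count`); the `activity_data` dict below is fabricated purely for illustration:

```python
import csv
import io

# Illustrative shape only: chart type -> {date -> count}
activity_data = {
    "logins": {"2024-01-01": 12, "2024-01-02": 9},
    "chats": {"2024-01-01": 30, "2024-01-02": 25},
}

output = io.StringIO()
writer = csv.writer(output)
writer.writerow(["Date", "Chart Type", "Activity Count"])
for chart_type, by_date in activity_data.items():
    for date_key in sorted(by_date):
        writer.writerow([date_key, chart_type.title(), by_date[date_key]])

print(output.getvalue())
```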
+ """ + try: + debug_print("🔄 [REFRESH DEBUG] Starting Control Center data refresh...") + debug_print("Starting Control Center data refresh...") + + # Check if request has specific user_id + from flask import request + try: + request_data = request.get_json(force=True) or {} + except: + # Handle case where no JSON body is sent + request_data = {} + + specific_user_id = request_data.get('user_id') + force_refresh = request_data.get('force_refresh', False) + + debug_print(f"🔄 [REFRESH DEBUG] Request data: user_id={specific_user_id}, force_refresh={force_refresh}") + + # Get all users to refresh their metrics + debug_print("🔄 [REFRESH DEBUG] Querying all users...") + users_query = "SELECT c.id, c.email, c.display_name, c.lastUpdated, c.settings FROM c" + all_users = list(cosmos_user_settings_container.query_items( + query=users_query, + enable_cross_partition_query=True + )) + debug_print(f"🔄 [REFRESH DEBUG] Found {len(all_users)} users to process") + + refreshed_count = 0 + failed_count = 0 + + # Refresh metrics for each user + debug_print("🔄 [REFRESH DEBUG] Starting user refresh loop...") + for user in all_users: + try: + user_id = user.get('id') + debug_print(f"🔄 [REFRESH DEBUG] Processing user {user_id}") + + # Force refresh of metrics for this user + enhanced_user = enhance_user_with_activity(user, force_refresh=True) + refreshed_count += 1 + + debug_print(f"✅ [REFRESH DEBUG] Successfully refreshed user {user_id}") + debug_print(f"Refreshed metrics for user {user_id}") + except Exception as user_error: + failed_count += 1 + debug_print(f"❌ [REFRESH DEBUG] Failed to refresh user {user.get('id')}: {user_error}") + debug_print(f"❌ [REFRESH DEBUG] User error traceback:") + import traceback + debug_print(traceback.format_exc()) + debug_print(f"Failed to refresh metrics for user {user.get('id')}: {user_error}") + + debug_print(f"🔄 [REFRESH DEBUG] User refresh loop completed. Refreshed: {refreshed_count}, Failed: {failed_count}") + + # Refresh metrics for all groups + debug_print("🔄 [REFRESH DEBUG] Starting group refresh...") + groups_refreshed_count = 0 + groups_failed_count = 0 + + try: + groups_query = "SELECT * FROM c" + all_groups = list(cosmos_groups_container.query_items( + query=groups_query, + enable_cross_partition_query=True + )) + debug_print(f"🔄 [REFRESH DEBUG] Found {len(all_groups)} groups to process") + + # Refresh metrics for each group + for group in all_groups: + try: + group_id = group.get('id') + debug_print(f"🔄 [REFRESH DEBUG] Processing group {group_id}") + + # Force refresh of metrics for this group + enhanced_group = enhance_group_with_activity(group, force_refresh=True) + groups_refreshed_count += 1 + + debug_print(f"✅ [REFRESH DEBUG] Successfully refreshed group {group_id}") + debug_print(f"Refreshed metrics for group {group_id}") + except Exception as group_error: + groups_failed_count += 1 + debug_print(f"❌ [REFRESH DEBUG] Failed to refresh group {group.get('id')}: {group_error}") + debug_print(f"❌ [REFRESH DEBUG] Group error traceback:") + import traceback + debug_print(traceback.format_exc()) + debug_print(f"Failed to refresh metrics for group {group.get('id')}: {group_error}") + + except Exception as groups_error: + debug_print(f"❌ [REFRESH DEBUG] Error querying groups: {groups_error}") + debug_print(f"Error querying groups for refresh: {groups_error}") + + debug_print(f"🔄 [REFRESH DEBUG] Group refresh loop completed. 
Refreshed: {groups_refreshed_count}, Failed: {groups_failed_count}") + + # Update admin settings with refresh timestamp + debug_print("🔄 [REFRESH DEBUG] Updating admin settings...") + try: + from functions_settings import get_settings, update_settings + + settings = get_settings() + if settings: + settings['control_center_last_refresh'] = datetime.now(timezone.utc).isoformat() + update_success = update_settings(settings) + + if not update_success: + debug_print("⚠️ [REFRESH DEBUG] Failed to update admin settings") + debug_print("Failed to update admin settings with refresh timestamp") + else: + debug_print("✅ [REFRESH DEBUG] Admin settings updated successfully") + debug_print("Updated admin settings with refresh timestamp") + else: + debug_print("⚠️ [REFRESH DEBUG] Could not get admin settings") + + except Exception as admin_error: + debug_print(f"❌ [REFRESH DEBUG] Admin settings update failed: {admin_error}") + debug_print(f"Error updating admin settings: {admin_error}") + + debug_print(f"🎉 [REFRESH DEBUG] Refresh completed! Users - Refreshed: {refreshed_count}, Failed: {failed_count}. Groups - Refreshed: {groups_refreshed_count}, Failed: {groups_failed_count}") + debug_print(f"Control Center data refresh completed. Users: {refreshed_count} refreshed, {failed_count} failed. Groups: {groups_refreshed_count} refreshed, {groups_failed_count} failed") + + return jsonify({ + 'success': True, + 'message': 'Control Center data refreshed successfully', + 'refreshed_users': refreshed_count, + 'failed_users': failed_count, + 'refreshed_groups': groups_refreshed_count, + 'failed_groups': groups_failed_count, + 'refresh_timestamp': datetime.now(timezone.utc).isoformat() + }), 200 + + except Exception as e: + debug_print(f"💥 [REFRESH DEBUG] MAJOR ERROR in refresh endpoint: {e}") + debug_print("💥 [REFRESH DEBUG] Full traceback:") + import traceback + debug_print(traceback.format_exc()) + debug_print(f"Error refreshing Control Center data: {e}") + return jsonify({'error': 'Failed to refresh data'}), 500 + + # Get refresh status API + @app.route('/api/admin/control-center/refresh-status', methods=['GET']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_get_refresh_status(): + """ + Get the last refresh timestamp for Control Center data. + """ + try: + from functions_settings import get_settings + + settings = get_settings() + last_refresh = settings.get('control_center_last_refresh') + + return jsonify({ + 'last_refresh': last_refresh, + 'last_refresh_formatted': None if not last_refresh else datetime.fromisoformat(last_refresh.replace('Z', '+00:00') if 'Z' in last_refresh else last_refresh).strftime('%m/%d/%Y %I:%M %p UTC') + }), 200 + + except Exception as e: + debug_print(f"Error getting refresh status: {e}") + return jsonify({'error': 'Failed to get refresh status'}), 500 + + # Activity Log Migration APIs + @app.route('/api/admin/control-center/migrate/status', methods=['GET']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_get_migration_status(): + """ + Check if there are conversations and documents that need to be migrated to activity logs. + Returns counts of records without the 'added_to_activity_log' flag. 
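A sketch of how an operator script might trigger the refresh and then read back the recorded timestamp. The authenticated session and base URL are assumptions, and a full refresh can take a while, hence the generous timeout:

```python
import requests

BASE_URL = "https://localhost:5000"  # placeholder
session = requests.Session()         # assumed authenticated as a Control Center admin

# Kick off a full recalculation of user and group metrics.
refresh = session.post(f"{BASE_URL}/api/admin/control-center/refresh", json={}, timeout=600)
refresh.raise_for_status()
summary = refresh.json()
print(f"Users refreshed: {summary['refreshed_users']}, failed: {summary['failed_users']}")
print(f"Groups refreshed: {summary['refreshed_groups']}, failed: {summary['failed_groups']}")

# Confirm the admin-settings timestamp was updated.
status = session.get(f"{BASE_URL}/api/admin/control-center/refresh-status", timeout=30)
status.raise_for_status()
print("Last refresh:", status.json().get("last_refresh_formatted"))
```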
+ """ + try: + migration_status = { + 'conversations_without_logs': 0, + 'personal_documents_without_logs': 0, + 'group_documents_without_logs': 0, + 'public_documents_without_logs': 0, + 'total_documents_without_logs': 0, + 'migration_needed': False, + 'estimated_total_records': 0 + } + + # Check conversations without the flag + try: + conversations_query = """ + SELECT VALUE COUNT(1) + FROM c + WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false + """ + conversations_result = list(cosmos_conversations_container.query_items( + query=conversations_query, + enable_cross_partition_query=True + )) + migration_status['conversations_without_logs'] = conversations_result[0] if conversations_result else 0 + except Exception as e: + debug_print(f"Error checking conversations migration status: {e}") + + # Check personal documents without the flag + try: + personal_docs_query = """ + SELECT VALUE COUNT(1) + FROM c + WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false + """ + personal_docs_result = list(cosmos_user_documents_container.query_items( + query=personal_docs_query, + enable_cross_partition_query=True + )) + migration_status['personal_documents_without_logs'] = personal_docs_result[0] if personal_docs_result else 0 + except Exception as e: + debug_print(f"Error checking personal documents migration status: {e}") + + # Check group documents without the flag + try: + group_docs_query = """ + SELECT VALUE COUNT(1) + FROM c + WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false + """ + group_docs_result = list(cosmos_group_documents_container.query_items( + query=group_docs_query, + enable_cross_partition_query=True + )) + migration_status['group_documents_without_logs'] = group_docs_result[0] if group_docs_result else 0 + except Exception as e: + debug_print(f"Error checking group documents migration status: {e}") + + # Check public documents without the flag + try: + public_docs_query = """ + SELECT VALUE COUNT(1) + FROM c + WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false + """ + public_docs_result = list(cosmos_public_documents_container.query_items( + query=public_docs_query, + enable_cross_partition_query=True + )) + migration_status['public_documents_without_logs'] = public_docs_result[0] if public_docs_result else 0 + except Exception as e: + debug_print(f"Error checking public documents migration status: {e}") + + # Calculate totals + migration_status['total_documents_without_logs'] = ( + migration_status['personal_documents_without_logs'] + + migration_status['group_documents_without_logs'] + + migration_status['public_documents_without_logs'] + ) + + migration_status['estimated_total_records'] = ( + migration_status['conversations_without_logs'] + + migration_status['total_documents_without_logs'] + ) + + migration_status['migration_needed'] = migration_status['estimated_total_records'] > 0 + + return jsonify(migration_status), 200 + + except Exception as e: + debug_print(f"Error getting migration status: {e}") + return jsonify({'error': 'Failed to get migration status'}), 500 + + @app.route('/api/admin/control-center/migrate/all', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_migrate_to_activity_logs(): + """ + Migrate all conversations and documents without activity logs. + This adds activity log records and sets the 'added_to_activity_log' flag. 
+ + WARNING: This may take a while for large datasets and could impact performance. + Recommended to run during off-peak hours. + """ + try: + from functions_activity_logging import log_conversation_creation, log_document_creation_transaction + + results = { + 'conversations_migrated': 0, + 'conversations_failed': 0, + 'personal_documents_migrated': 0, + 'personal_documents_failed': 0, + 'group_documents_migrated': 0, + 'group_documents_failed': 0, + 'public_documents_migrated': 0, + 'public_documents_failed': 0, + 'total_migrated': 0, + 'total_failed': 0, + 'errors': [] + } + + # Migrate conversations + debug_print("Starting conversation migration...") + try: + conversations_query = """ + SELECT * + FROM c + WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false + """ + conversations = list(cosmos_conversations_container.query_items( + query=conversations_query, + enable_cross_partition_query=True + )) + + debug_print(f"Found {len(conversations)} conversations to migrate") + + for conv in conversations: + try: + # Create activity log directly to preserve original timestamp + activity_log = { + 'id': str(uuid.uuid4()), + 'activity_type': 'conversation_creation', + 'user_id': conv.get('user_id'), + 'timestamp': conv.get('created_at') or conv.get('last_updated') or datetime.utcnow().isoformat(), + 'created_at': conv.get('created_at') or conv.get('last_updated') or datetime.utcnow().isoformat(), + 'conversation': { + 'conversation_id': conv.get('id'), + 'title': conv.get('title', 'Untitled'), + 'context': conv.get('context', []), + 'tags': conv.get('tags', []) + }, + 'workspace_type': 'personal', + 'workspace_context': {} + } + + # Save to activity logs container + cosmos_activity_logs_container.upsert_item(activity_log) + + # Add flag to conversation + conv['added_to_activity_log'] = True + cosmos_conversations_container.upsert_item(conv) + + results['conversations_migrated'] += 1 + + except Exception as conv_error: + results['conversations_failed'] += 1 + error_msg = f"Failed to migrate conversation {conv.get('id')}: {str(conv_error)}" + debug_print(error_msg) + results['errors'].append(error_msg) + + except Exception as e: + error_msg = f"Error during conversation migration: {str(e)}" + debug_print(error_msg) + results['errors'].append(error_msg) + + # Migrate personal documents + debug_print("Starting personal documents migration...") + try: + personal_docs_query = """ + SELECT * + FROM c + WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false + """ + personal_docs = list(cosmos_user_documents_container.query_items( + query=personal_docs_query, + enable_cross_partition_query=True + )) + + for doc in personal_docs: + try: + # Create activity log directly to preserve original timestamp + activity_log = { + 'id': str(uuid.uuid4()), + 'user_id': doc.get('user_id'), + 'activity_type': 'document_creation', + 'workspace_type': 'personal', + 'timestamp': doc.get('upload_date') or datetime.utcnow().isoformat(), + 'created_at': doc.get('upload_date') or datetime.utcnow().isoformat(), + 'document': { + 'document_id': doc.get('id'), + 'file_name': doc.get('file_name', 'Unknown'), + 'file_type': doc.get('file_type', 'unknown'), + 'file_size_bytes': doc.get('file_size', 0), + 'page_count': doc.get('number_of_pages', 0), + 'version': doc.get('version', 1) + }, + 'embedding_usage': { + 'total_tokens': doc.get('embedding_tokens', 0), + 'model_deployment_name': doc.get('embedding_model_deployment_name', 'unknown') + }, + 'document_metadata': { + 
'author': doc.get('author'), + 'title': doc.get('title'), + 'subject': doc.get('subject'), + 'publication_date': doc.get('publication_date'), + 'keywords': doc.get('keywords', []), + 'abstract': doc.get('abstract') + }, + 'workspace_context': {} + } + + # Save to activity logs container + cosmos_activity_logs_container.upsert_item(activity_log) + + # Add flag to document + doc['added_to_activity_log'] = True + cosmos_user_documents_container.upsert_item(doc) + + results['personal_documents_migrated'] += 1 + + except Exception as doc_error: + results['personal_documents_failed'] += 1 + error_msg = f"Failed to migrate personal document {doc.get('id')}: {str(doc_error)}" + debug_print(error_msg) + results['errors'].append(error_msg) + + except Exception as e: + error_msg = f"Error during personal documents migration: {str(e)}" + debug_print(error_msg) + results['errors'].append(error_msg) + + # Migrate group documents + debug_print("Starting group documents migration...") + try: + group_docs_query = """ + SELECT * + FROM c + WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false + """ + group_docs = list(cosmos_group_documents_container.query_items( + query=group_docs_query, + enable_cross_partition_query=True + )) + + for doc in group_docs: + try: + # Create activity log directly to preserve original timestamp + activity_log = { + 'id': str(uuid.uuid4()), + 'user_id': doc.get('user_id'), + 'activity_type': 'document_creation', + 'workspace_type': 'group', + 'timestamp': doc.get('upload_date') or datetime.utcnow().isoformat(), + 'created_at': doc.get('upload_date') or datetime.utcnow().isoformat(), + 'document': { + 'document_id': doc.get('id'), + 'file_name': doc.get('file_name', 'Unknown'), + 'file_type': doc.get('file_type', 'unknown'), + 'file_size_bytes': doc.get('file_size', 0), + 'page_count': doc.get('number_of_pages', 0), + 'version': doc.get('version', 1) + }, + 'embedding_usage': { + 'total_tokens': doc.get('embedding_tokens', 0), + 'model_deployment_name': doc.get('embedding_model_deployment_name', 'unknown') + }, + 'document_metadata': { + 'author': doc.get('author'), + 'title': doc.get('title'), + 'subject': doc.get('subject'), + 'publication_date': doc.get('publication_date'), + 'keywords': doc.get('keywords', []), + 'abstract': doc.get('abstract') + }, + 'workspace_context': { + 'group_id': doc.get('group_id') + } + } + + # Save to activity logs container + cosmos_activity_logs_container.upsert_item(activity_log) + + # Add flag to document + doc['added_to_activity_log'] = True + cosmos_group_documents_container.upsert_item(doc) + + results['group_documents_migrated'] += 1 + + except Exception as doc_error: + results['group_documents_failed'] += 1 + error_msg = f"Failed to migrate group document {doc.get('id')}: {str(doc_error)}" + debug_print(error_msg) + results['errors'].append(error_msg) + + except Exception as e: + error_msg = f"Error during group documents migration: {str(e)}" + debug_print(error_msg) + results['errors'].append(error_msg) + + # Migrate public documents + debug_print("Starting public documents migration...") + try: + public_docs_query = """ + SELECT * + FROM c + WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false + """ + public_docs = list(cosmos_public_documents_container.query_items( + query=public_docs_query, + enable_cross_partition_query=True + )) + + for doc in public_docs: + try: + # Create activity log directly to preserve original timestamp + activity_log = { + 'id': str(uuid.uuid4()), + 
'user_id': doc.get('user_id'), + 'activity_type': 'document_creation', + 'workspace_type': 'public', + 'timestamp': doc.get('upload_date') or datetime.utcnow().isoformat(), + 'created_at': doc.get('upload_date') or datetime.utcnow().isoformat(), + 'document': { + 'document_id': doc.get('id'), + 'file_name': doc.get('file_name', 'Unknown'), + 'file_type': doc.get('file_type', 'unknown'), + 'file_size_bytes': doc.get('file_size', 0), + 'page_count': doc.get('number_of_pages', 0), + 'version': doc.get('version', 1) + }, + 'embedding_usage': { + 'total_tokens': doc.get('embedding_tokens', 0), + 'model_deployment_name': doc.get('embedding_model_deployment_name', 'unknown') + }, + 'document_metadata': { + 'author': doc.get('author'), + 'title': doc.get('title'), + 'subject': doc.get('subject'), + 'publication_date': doc.get('publication_date'), + 'keywords': doc.get('keywords', []), + 'abstract': doc.get('abstract') + }, + 'workspace_context': { + 'public_workspace_id': doc.get('public_workspace_id') + } + } + + # Save to activity logs container + cosmos_activity_logs_container.upsert_item(activity_log) + + # Add flag to document + doc['added_to_activity_log'] = True + cosmos_public_documents_container.upsert_item(doc) + + results['public_documents_migrated'] += 1 + + except Exception as doc_error: + results['public_documents_failed'] += 1 + error_msg = f"Failed to migrate public document {doc.get('id')}: {str(doc_error)}" + debug_print(error_msg) + results['errors'].append(error_msg) + + except Exception as e: + error_msg = f"Error during public documents migration: {str(e)}" + debug_print(error_msg) + results['errors'].append(error_msg) + + # Calculate totals + results['total_migrated'] = ( + results['conversations_migrated'] + + results['personal_documents_migrated'] + + results['group_documents_migrated'] + + results['public_documents_migrated'] + ) + + results['total_failed'] = ( + results['conversations_failed'] + + results['personal_documents_failed'] + + results['group_documents_failed'] + + results['public_documents_failed'] + ) + + debug_print(f"Migration complete: {results['total_migrated']} migrated, {results['total_failed']} failed") + + return jsonify(results), 200 + + except Exception as e: + debug_print(f"Error during migration: {e}") + import traceback + traceback.print_exc() + return jsonify({'error': f'Migration failed: {str(e)}'}), 500 + + @app.route('/api/admin/control-center/activity-logs', methods=['GET']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_get_activity_logs(): + """ + Get paginated and filtered activity logs from cosmos_activity_logs_container. + Supports search and filtering by activity type. 
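The per-container migration above repeats one idempotent pattern: read unflagged items, write an activity-log record that preserves the original timestamp, then flag the source item so re-runs skip it. A condensed sketch of that pattern (the helper name is illustrative; the arguments are `azure-cosmos` container proxies like those used above):

```python
import uuid
from datetime import datetime, timezone

def migrate_documents(source_container, logs_container, workspace_type: str) -> int:
    """Backfill activity logs for documents that have not been logged yet."""
    query = """
        SELECT * FROM c
        WHERE NOT IS_DEFINED(c.added_to_activity_log) OR c.added_to_activity_log = false
    """
    migrated = 0
    for doc in source_container.query_items(query=query, enable_cross_partition_query=True):
        original_ts = doc.get("upload_date") or datetime.now(timezone.utc).isoformat()
        logs_container.upsert_item({
            "id": str(uuid.uuid4()),
            "user_id": doc.get("user_id"),
            "activity_type": "document_creation",
            "workspace_type": workspace_type,
            "timestamp": original_ts,        # preserve the original upload time
            "created_at": original_ts,
            "document": {"document_id": doc.get("id"), "file_name": doc.get("file_name", "Unknown")},
        })
        doc["added_to_activity_log"] = True  # flag so re-runs skip this item
        source_container.upsert_item(doc)
        migrated += 1
    return migrated
```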
+ """ + try: + # Get query parameters + page = int(request.args.get('page', 1)) + per_page = int(request.args.get('per_page', 50)) + search_term = request.args.get('search', '').strip().lower() + activity_type_filter = request.args.get('activity_type_filter', 'all').strip() + + # Build query conditions + query_conditions = [] + parameters = [] + + # Filter by activity type if not 'all' + if activity_type_filter and activity_type_filter != 'all': + query_conditions.append("c.activity_type = @activity_type") + parameters.append({"name": "@activity_type", "value": activity_type_filter}) + + # Build WHERE clause (empty if no conditions) + where_clause = " WHERE " + " AND ".join(query_conditions) if query_conditions else "" + + # Get total count for pagination + count_query = f"SELECT VALUE COUNT(1) FROM c{where_clause}" + total_items_result = list(cosmos_activity_logs_container.query_items( + query=count_query, + parameters=parameters, + enable_cross_partition_query=True + )) + total_items = total_items_result[0] if total_items_result and isinstance(total_items_result[0], int) else 0 + + # Calculate pagination + offset = (page - 1) * per_page + total_pages = (total_items + per_page - 1) // per_page if total_items > 0 else 1 + + # Get paginated results + logs_query = f""" + SELECT * FROM c{where_clause} + ORDER BY c.timestamp DESC + OFFSET {offset} LIMIT {per_page} + """ + + debug_print(f"Activity logs query: {logs_query}") + debug_print(f"Query parameters: {parameters}") + + logs = list(cosmos_activity_logs_container.query_items( + query=logs_query, + parameters=parameters, + enable_cross_partition_query=True + )) + + # Apply search filter in Python (after fetching from Cosmos) + if search_term: + filtered_logs = [] + for log in logs: + # Search in various fields + searchable_text = ' '.join([ + str(log.get('activity_type', '')), + str(log.get('user_id', '')), + str(log.get('login_method', '')), + str(log.get('conversation', {}).get('title', '')), + str(log.get('document', {}).get('file_name', '')), + str(log.get('token_type', '')), + str(log.get('workspace_type', '')) + ]).lower() + + if search_term in searchable_text: + filtered_logs.append(log) + + logs = filtered_logs + # Recalculate total_items for filtered results + total_items = len(logs) + total_pages = (total_items + per_page - 1) // per_page if total_items > 0 else 1 + + # Get unique user IDs from logs + user_ids = set(log.get('user_id') for log in logs if log.get('user_id')) + + # Fetch user information for display names/emails + user_map = {} + if user_ids: + for user_id in user_ids: + try: + user_doc = cosmos_user_settings_container.read_item( + item=user_id, + partition_key=user_id + ) + user_map[user_id] = { + 'email': user_doc.get('email', ''), + 'display_name': user_doc.get('display_name', '') + } + except: + user_map[user_id] = { + 'email': '', + 'display_name': '' + } + + return jsonify({ + 'logs': logs, + 'user_map': user_map, + 'pagination': { + 'page': page, + 'per_page': per_page, + 'total_items': total_items, + 'total_pages': total_pages, + 'has_prev': page > 1, + 'has_next': page < total_pages + } + }), 200 + + except Exception as e: + debug_print(f"Error getting activity logs: {e}") + import traceback + traceback.print_exc() + return jsonify({'error': 'Failed to fetch activity logs'}), 500 + + # ============================================================================ + # APPROVAL WORKFLOW ENDPOINTS + # ============================================================================ + + 
@app.route('/api/admin/control-center/approvals', methods=['GET']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_admin_get_approvals(): + """ + Get approval requests visible to the current user. + + Query Parameters: + page (int): Page number (default: 1) + page_size (int): Items per page (default: 20) + status (str): Filter by status (pending, approved, denied, all) + action_type (str): Filter by action type + search (str): Search by group name or reason + """ + try: + user = session.get('user', {}) + user_id = user.get('oid') or user.get('sub') + + # Get user roles from session + user_roles = user.get('roles', []) + + # Get query parameters + page = int(request.args.get('page', 1)) + page_size = int(request.args.get('page_size', 20)) + status_filter = request.args.get('status', 'all') + action_type_filter = request.args.get('action_type', 'all') + search_query = request.args.get('search', '') + + # Determine include_completed based on status filter + include_completed = (status_filter == 'all' or status_filter in ['approved', 'denied']) + + # Map action_type to request_type_filter + request_type_filter = None if action_type_filter == 'all' else action_type_filter + + # Fetch approvals + result = get_pending_approvals( + user_id=user_id, + user_roles=user_roles, + page=page, + per_page=page_size, + include_completed=include_completed, + request_type_filter=request_type_filter + ) + + # Add can_approve field to each approval + approvals_with_permission = [] + for approval in result.get('approvals', []): + approval_copy = dict(approval) + # User can approve if they didn't create the request OR if they're the only admin + approval_copy['can_approve'] = (approval.get('requester_id') != user_id) + approvals_with_permission.append(approval_copy) + + # Rename fields to match frontend expectations + return jsonify({ + 'success': True, + 'approvals': approvals_with_permission, + 'total_count': result.get('total', 0), + 'page': result.get('page', 1), + 'page_size': result.get('per_page', page_size), + 'total_pages': result.get('total_pages', 0) + }), 200 + + except Exception as e: + debug_print(f"Error fetching approvals: {e}") + import traceback + debug_print(traceback.format_exc()) + return jsonify({'error': 'Failed to fetch approvals', 'details': str(e)}), 500 + + @app.route('/api/admin/control-center/approvals/', methods=['GET']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_admin_get_approval_by_id(approval_id): + """ + Get a single approval request by ID. 
+ + Query Parameters: + group_id (str): Group ID (partition key) + """ + try: + user = session.get('user', {}) + user_id = user.get('oid') or user.get('sub') + + group_id = request.args.get('group_id') + if not group_id: + return jsonify({'error': 'group_id query parameter is required'}), 400 + + # Get the approval + approval = cosmos_approvals_container.read_item( + item=approval_id, + partition_key=group_id + ) + + # Add can_approve field + approval['can_approve'] = (approval.get('requester_id') != user_id) + + return jsonify(approval), 200 + + except Exception as e: + debug_print(f"Error fetching approval {approval_id}: {e}") + import traceback + debug_print(traceback.format_exc()) + return jsonify({'error': 'Failed to fetch approval', 'details': str(e)}), 500 + + @app.route('/api/admin/control-center/approvals//approve', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_admin_approve_request(approval_id): + """ + Approve an approval request and execute the action. + + Body: + group_id (str): Group ID (partition key) + comment (str, optional): Approval comment + """ + try: + user = session.get('user', {}) + user_id = user.get('oid') or user.get('sub') + user_email = user.get('preferred_username', user.get('email', 'unknown')) + user_name = user.get('name', user_email) + + data = request.get_json() + group_id = data.get('group_id') + comment = data.get('comment', '') + + if not group_id: + return jsonify({'error': 'group_id is required'}), 400 + + # Approve the request + approval = approve_request( + approval_id=approval_id, + group_id=group_id, + approver_id=user_id, + approver_email=user_email, + approver_name=user_name, + comment=comment + ) + + # Execute the approved action + execution_result = _execute_approved_action(approval, user_id, user_email, user_name) + + return jsonify({ + 'success': True, + 'message': 'Request approved and executed', + 'approval': approval, + 'execution_result': execution_result + }), 200 + + except Exception as e: + debug_print(f"Error approving request: {e}") + return jsonify({'error': str(e)}), 500 + + @app.route('/api/admin/control-center/approvals//deny', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @control_center_required('admin') + def api_admin_deny_request(approval_id): + """ + Deny an approval request. 
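A hedged sketch of the approve round-trip for the admin endpoints above: list pending approvals, then approve the first one the caller is allowed to act on. Note the partition-key `group_id` has to travel in the request body; the session and base URL are assumptions:

```python
import requests

BASE_URL = "https://localhost:5000"  # placeholder
session = requests.Session()         # assumed authenticated Control Center admin

pending = session.get(
    f"{BASE_URL}/api/admin/control-center/approvals",
    params={"status": "pending", "page": 1, "page_size": 20},
    timeout=30,
).json()

for approval in pending.get("approvals", []):
    if not approval.get("can_approve"):
        continue  # requesters cannot approve their own requests
    result = session.post(
        f"{BASE_URL}/api/admin/control-center/approvals/{approval['id']}/approve",
        json={"group_id": approval["group_id"], "comment": "Approved via script"},
        timeout=60,
    )
    result.raise_for_status()
    print(approval["id"], "->", result.json().get("execution_result"))
    break  # approve just the first eligible request in this sketch
```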
+ + Body: + group_id (str): Group ID (partition key) + comment (str): Reason for denial (required) + """ + try: + user = session.get('user', {}) + user_id = user.get('oid') or user.get('sub') + user_email = user.get('preferred_username', user.get('email', 'unknown')) + user_name = user.get('name', user_email) + + data = request.get_json() + group_id = data.get('group_id') + comment = data.get('comment', '') + + if not group_id: + return jsonify({'error': 'group_id is required'}), 400 + + if not comment: + return jsonify({'error': 'comment is required for denial'}), 400 + + # Deny the request + approval = deny_request( + approval_id=approval_id, + group_id=group_id, + denier_id=user_id, + denier_email=user_email, + denier_name=user_name, + comment=comment, + auto_denied=False + ) + + return jsonify({ + 'success': True, + 'message': 'Request denied', + 'approval': approval + }), 200 + + except Exception as e: + debug_print(f"Error denying request: {e}") + return jsonify({'error': str(e)}), 500 + + # New standalone approvals API endpoints (accessible to all users with permissions) + @app.route('/api/approvals', methods=['GET']) + @login_required + def api_get_approvals(): + """ + Get approval requests visible to the current user (admins, control center admins, and group owners). + + Query Parameters: + page (int): Page number (default: 1) + page_size (int): Items per page (default: 20) + status (str): Filter by status (pending, approved, denied, all) + action_type (str): Filter by action type + search (str): Search by group name or reason + """ + try: + user = session.get('user', {}) + user_id = user.get('oid') or user.get('sub') + user_roles = user.get('roles', []) + + # Get query parameters + page = int(request.args.get('page', 1)) + page_size = int(request.args.get('page_size', 20)) + status_filter = request.args.get('status', 'pending') + action_type_filter = request.args.get('action_type', 'all') + search_query = request.args.get('search', '') + + debug_print(f"📋 [APPROVALS API] Fetching approvals - status_filter: {status_filter}, action_type: {action_type_filter}") + + # Determine include_completed based on status filter + # 'all' means show everything, specific statuses mean show only those + include_completed = (status_filter in ['all', 'approved', 'denied', 'executed']) + + debug_print(f"📋 [APPROVALS API] include_completed: {include_completed}") + + # Map action_type to request_type_filter + request_type_filter = None if action_type_filter == 'all' else action_type_filter + + # Fetch approvals + result = get_pending_approvals( + user_id=user_id, + user_roles=user_roles, + page=page, + per_page=page_size, + include_completed=include_completed, + request_type_filter=request_type_filter, + status_filter=status_filter + ) + + # Add can_approve field to each approval + approvals_with_permission = [] + for approval in result.get('approvals', []): + approval_copy = dict(approval) + # User can approve if they didn't create the request + approval_copy['can_approve'] = (approval.get('requester_id') != user_id) + approvals_with_permission.append(approval_copy) + + return jsonify({ + 'success': True, + 'approvals': approvals_with_permission, + 'total_count': result.get('total', 0), + 'page': result.get('page', 1), + 'page_size': result.get('per_page', page_size), + 'total_pages': result.get('total_pages', 0) + }), 200 + + except Exception as e: + debug_print(f"Error fetching approvals: {e}") + import traceback + debug_print(traceback.format_exc()) + return jsonify({'error': 'Failed to fetch 
approvals', 'details': str(e)}), 500 + + @app.route('/api/approvals/', methods=['GET']) + @swagger_route(security=get_auth_security()) + @login_required + def api_get_approval_by_id(approval_id): + """ + Get a single approval request by ID. + + Query Parameters: + group_id (str): Group ID (partition key) + """ + try: + user = session.get('user', {}) + user_id = user.get('oid') or user.get('sub') + + group_id = request.args.get('group_id') + if not group_id: + return jsonify({'error': 'group_id query parameter is required'}), 400 + + # Get the approval + approval = cosmos_approvals_container.read_item( + item=approval_id, + partition_key=group_id + ) + + # Add can_approve field + approval['can_approve'] = (approval.get('requester_id') != user_id) + + return jsonify(approval), 200 + + except Exception as e: + debug_print(f"Error fetching approval {approval_id}: {e}") + import traceback + debug_print(traceback.format_exc()) + return jsonify({'error': 'Failed to fetch approval', 'details': str(e)}), 500 + + @app.route('/api/approvals//approve', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + def api_approve_request(approval_id): + """ + Approve an approval request and execute the action. + + Body: + group_id (str): Group ID (partition key) + comment (str, optional): Approval comment + """ + try: + user = session.get('user', {}) + user_id = user.get('oid') or user.get('sub') + user_email = user.get('preferred_username', user.get('email', 'unknown')) + user_name = user.get('name', user_email) + + data = request.get_json() + group_id = data.get('group_id') + comment = data.get('comment', '') + + if not group_id: + return jsonify({'error': 'group_id is required'}), 400 + + # Approve the request + approval = approve_request( + approval_id=approval_id, + group_id=group_id, + approver_id=user_id, + approver_email=user_email, + approver_name=user_name, + comment=comment + ) + + # Execute the approved action + execution_result = _execute_approved_action(approval, user_id, user_email, user_name) + + return jsonify({ + 'success': True, + 'message': 'Request approved and executed', + 'approval': approval, + 'execution_result': execution_result + }), 200 + + except Exception as e: + debug_print(f"Error approving request: {e}") + return jsonify({'error': str(e)}), 500 + + @app.route('/api/approvals//deny', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + def api_deny_request(approval_id): + """ + Deny an approval request. 
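A companion sketch for the deny flow on the standalone `/api/approvals` routes: the body must carry both the partition-key `group_id` and a non-empty `comment`, otherwise the handler returns 400. All identifiers below are placeholders:

```python
import requests

BASE_URL = "https://localhost:5000"   # placeholder
session = requests.Session()          # assumed authenticated user with approval rights

approval_id = "00000000-0000-0000-0000-000000000000"  # placeholder approval id
group_id = "11111111-1111-1111-1111-111111111111"     # placeholder partition key

resp = session.post(
    f"{BASE_URL}/api/approvals/{approval_id}/deny",
    json={"group_id": group_id, "comment": "Denied: documents still under legal hold"},
    timeout=30,
)
if resp.status_code == 400:
    print("Denial rejected:", resp.json().get("error"))
else:
    resp.raise_for_status()
    print(resp.json().get("message"))
```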
+ + Body: + group_id (str): Group ID (partition key) + comment (str): Reason for denial (required) + """ + try: + user = session.get('user', {}) + user_id = user.get('oid') or user.get('sub') + user_email = user.get('preferred_username', user.get('email', 'unknown')) + user_name = user.get('name', user_email) + + data = request.get_json() + group_id = data.get('group_id') + comment = data.get('comment', '') + + if not group_id: + return jsonify({'error': 'group_id is required'}), 400 + + if not comment: + return jsonify({'error': 'comment is required for denial'}), 400 + + # Deny the request + approval = deny_request( + approval_id=approval_id, + group_id=group_id, + denier_id=user_id, + denier_email=user_email, + denier_name=user_name, + comment=comment, + auto_denied=False + ) + + return jsonify({ + 'success': True, + 'message': 'Request denied', + 'approval': approval + }), 200 + + except Exception as e: + debug_print(f"Error denying request: {e}") + return jsonify({'error': str(e)}), 500 + + def _execute_approved_action(approval, executor_id, executor_email, executor_name): + """ + Execute the action specified in an approved request. + + Args: + approval: Approved request document + executor_id: User ID executing the action + executor_email: Email of executor + executor_name: Display name of executor + + Returns: + Result dictionary with success status and message + """ + try: + request_type = approval['request_type'] + group_id = approval['group_id'] + + if request_type == TYPE_TAKE_OWNERSHIP: + # Execute take ownership + # Check if this is for a public workspace or group + if approval.get('metadata', {}).get('entity_type') == 'workspace': + result = _execute_take_workspace_ownership(approval, executor_id, executor_email, executor_name) + else: + result = _execute_take_ownership(approval, executor_id, executor_email, executor_name) + + elif request_type == TYPE_TRANSFER_OWNERSHIP: + # Execute transfer ownership + # Check if this is for a public workspace or group + if approval.get('metadata', {}).get('entity_type') == 'workspace': + result = _execute_transfer_workspace_ownership(approval, executor_id, executor_email, executor_name) + else: + result = _execute_transfer_ownership(approval, executor_id, executor_email, executor_name) + + elif request_type == TYPE_DELETE_DOCUMENTS: + # Check if this is for a public workspace or group + if approval.get('metadata', {}).get('entity_type') == 'workspace': + result = _execute_delete_public_workspace_documents(approval, executor_id, executor_email, executor_name) + else: + result = _execute_delete_documents(approval, executor_id, executor_email, executor_name) + + elif request_type == TYPE_DELETE_GROUP: + # Check if this is for a public workspace or group + if approval.get('metadata', {}).get('entity_type') == 'workspace': + result = _execute_delete_public_workspace(approval, executor_id, executor_email, executor_name) + else: + result = _execute_delete_group(approval, executor_id, executor_email, executor_name) + + elif request_type == TYPE_DELETE_USER_DOCUMENTS: + # Execute delete user documents + result = _execute_delete_user_documents(approval, executor_id, executor_email, executor_name) + + else: + result = {'success': False, 'message': f'Unknown request type: {request_type}'} + + # Mark approval as executed + mark_approval_executed( + approval_id=approval['id'], + group_id=group_id, + success=result['success'], + result_message=result['message'] + ) + + return result + + except Exception as e: + # Mark as failed + mark_approval_executed( 
+ approval_id=approval['id'], + group_id=approval['group_id'], + success=False, + result_message=f"Execution error: {str(e)}" + ) + raise + + def _execute_take_ownership(approval, executor_id, executor_email, executor_name): + """Execute admin take ownership action.""" + try: + group_id = approval['group_id'] + requester_id = approval['requester_id'] + requester_email = approval['requester_email'] + + # Get the group + group = cosmos_groups_container.read_item(item=group_id, partition_key=group_id) + + old_owner = group.get('owner', {}) + old_owner_id = old_owner.get('id') + old_owner_email = old_owner.get('email', 'unknown') + + # Update owner to requester (the admin who requested) + group['owner'] = { + 'id': requester_id, + 'email': requester_email, + 'displayName': approval['requester_name'] + } + + # Remove requester from special roles if present + if requester_id in group.get('admins', []): + group['admins'].remove(requester_id) + if requester_id in group.get('documentManagers', []): + group['documentManagers'].remove(requester_id) + + # Ensure requester is in users list + requester_in_users = any(m.get('userId') == requester_id for m in group.get('users', [])) + if not requester_in_users: + group.setdefault('users', []).append({ + 'userId': requester_id, + 'email': requester_email, + 'displayName': approval['requester_name'] + }) + + # Demote old owner to regular member + if old_owner_id: + old_owner_in_users = any(m.get('userId') == old_owner_id for m in group.get('users', [])) + if not old_owner_in_users: + group.setdefault('users', []).append({ + 'userId': old_owner_id, + 'email': old_owner_email, + 'displayName': old_owner.get('displayName', old_owner_email) + }) + + if old_owner_id in group.get('admins', []): + group['admins'].remove(old_owner_id) + if old_owner_id in group.get('documentManagers', []): + group['documentManagers'].remove(old_owner_id) + + group['modifiedDate'] = datetime.utcnow().isoformat() + cosmos_groups_container.upsert_item(group) + + # Log to activity logs + activity_record = { + 'id': str(uuid.uuid4()), + 'type': 'group_ownership_change', + 'activity_type': 'admin_take_ownership_approved', + 'timestamp': datetime.utcnow().isoformat(), + 'admin_user_id': requester_id, + 'admin_email': requester_email, + 'approver_id': executor_id, + 'approver_email': executor_email, + 'group_id': group_id, + 'group_name': group.get('name', 'Unknown'), + 'old_owner_id': old_owner_id, + 'old_owner_email': old_owner_email, + 'new_owner_id': requester_id, + 'new_owner_email': requester_email, + 'approval_id': approval['id'], + 'description': f"Admin {requester_email} took ownership (approved by {executor_email})" + } + cosmos_activity_logs_container.create_item(body=activity_record) + + return { + 'success': True, + 'message': f'Ownership transferred to {requester_email}' + } + + except Exception as e: + return {'success': False, 'message': f'Failed to take ownership: {str(e)}'} + + def _execute_take_workspace_ownership(approval, executor_id, executor_email, executor_name): + """Execute admin take workspace ownership action.""" + try: + workspace_id = approval.get('workspace_id') or approval.get('group_id') + requester_id = approval['requester_id'] + requester_email = approval['requester_email'] + requester_name = approval['requester_name'] + + # Get the workspace + workspace = cosmos_public_workspaces_container.read_item(item=workspace_id, partition_key=workspace_id) + + # Get old owner info + old_owner = workspace.get('owner', {}) + if isinstance(old_owner, dict): + 
old_owner_id = old_owner.get('userId') + old_owner_email = old_owner.get('email') + old_owner_name = old_owner.get('displayName') + else: + # Old format where owner is just a string + old_owner_id = old_owner + # Try to get user info + try: + old_owner_user = cosmos_user_settings_container.read_item( + item=old_owner_id, + partition_key=old_owner_id + ) + old_owner_email = old_owner_user.get('email', 'unknown') + old_owner_name = old_owner_user.get('display_name', old_owner_email) + except: + old_owner_email = 'unknown' + old_owner_name = 'unknown' + + # Update owner to requester (the admin who requested) with full user object + workspace['owner'] = { + 'userId': requester_id, + 'email': requester_email, + 'displayName': requester_name + } + + # Remove requester from admins/documentManagers if present + new_admins = [] + for admin in workspace.get('admins', []): + admin_id = admin.get('userId') if isinstance(admin, dict) else admin + if admin_id != requester_id: + # Ensure admin is full object + if isinstance(admin, dict): + new_admins.append(admin) + else: + # Convert string ID to object if needed + try: + admin_user = cosmos_user_settings_container.read_item( + item=admin, + partition_key=admin + ) + new_admins.append({ + 'userId': admin, + 'email': admin_user.get('email', 'unknown'), + 'displayName': admin_user.get('display_name', 'unknown') + }) + except: + pass + workspace['admins'] = new_admins + + new_dms = [] + for dm in workspace.get('documentManagers', []): + dm_id = dm.get('userId') if isinstance(dm, dict) else dm + if dm_id != requester_id: + # Ensure dm is full object + if isinstance(dm, dict): + new_dms.append(dm) + else: + # Convert string ID to object if needed + try: + dm_user = cosmos_user_settings_container.read_item( + item=dm, + partition_key=dm + ) + new_dms.append({ + 'userId': dm, + 'email': dm_user.get('email', 'unknown'), + 'displayName': dm_user.get('display_name', 'unknown') + }) + except: + pass + workspace['documentManagers'] = new_dms + + # Demote old owner to admin if not already there + if old_owner_id and old_owner_id != requester_id: + old_owner_in_admins = any( + (a.get('userId') if isinstance(a, dict) else a) == old_owner_id + for a in workspace.get('admins', []) + ) + old_owner_in_dms = any( + (dm.get('userId') if isinstance(dm, dict) else dm) == old_owner_id + for dm in workspace.get('documentManagers', []) + ) + + if not old_owner_in_admins and not old_owner_in_dms: + # Add old owner as admin + workspace.setdefault('admins', []).append({ + 'userId': old_owner_id, + 'email': old_owner_email, + 'displayName': old_owner_name + }) + + workspace['modifiedDate'] = datetime.utcnow().isoformat() + cosmos_public_workspaces_container.upsert_item(workspace) + + # Log to activity logs + activity_record = { + 'id': str(uuid.uuid4()), + 'type': 'workspace_ownership_change', + 'activity_type': 'admin_take_ownership_approved', + 'timestamp': datetime.utcnow().isoformat(), + 'requester_id': requester_id, + 'requester_email': requester_email, + 'approver_id': executor_id, + 'approver_email': executor_email, + 'workspace_id': workspace_id, + 'workspace_name': workspace.get('name', 'Unknown'), + 'old_owner_id': old_owner_id, + 'old_owner_email': old_owner_email, + 'new_owner_id': requester_id, + 'new_owner_email': requester_email, + 'approval_id': approval['id'], + 'description': f"Admin {requester_email} took ownership (approved by {executor_email})" + } + cosmos_activity_logs_container.create_item(body=activity_record) + + return { + 'success': True, + 'message': 
f"Ownership transferred to {requester_email}" + } + + except Exception as e: + return {'success': False, 'message': f'Failed to take workspace ownership: {str(e)}'} + + def _execute_transfer_ownership(approval, executor_id, executor_email, executor_name): + """Execute transfer ownership action.""" + try: + group_id = approval['group_id'] + new_owner_id = approval['metadata'].get('new_owner_id') + + if not new_owner_id: + return {'success': False, 'message': 'new_owner_id not found in approval metadata'} + + # Get the group + group = cosmos_groups_container.read_item(item=group_id, partition_key=group_id) + + # Find new owner in members + new_owner_member = None + for member in group.get('users', []): + if member.get('userId') == new_owner_id: + new_owner_member = member + break + + if not new_owner_member: + return {'success': False, 'message': 'New owner not found in group members'} + + old_owner = group.get('owner', {}) + old_owner_id = old_owner.get('id') + + # Update owner + group['owner'] = { + 'id': new_owner_id, + 'email': new_owner_member.get('email'), + 'displayName': new_owner_member.get('displayName') + } + + # Remove new owner from special roles + if new_owner_id in group.get('admins', []): + group['admins'].remove(new_owner_id) + if new_owner_id in group.get('documentManagers', []): + group['documentManagers'].remove(new_owner_id) + + # Demote old owner to member + if old_owner_id: + old_owner_in_users = any(m.get('userId') == old_owner_id for m in group.get('users', [])) + if not old_owner_in_users: + group.setdefault('users', []).append({ + 'userId': old_owner_id, + 'email': old_owner.get('email'), + 'displayName': old_owner.get('displayName') + }) + + if old_owner_id in group.get('admins', []): + group['admins'].remove(old_owner_id) + if old_owner_id in group.get('documentManagers', []): + group['documentManagers'].remove(old_owner_id) + + group['modifiedDate'] = datetime.utcnow().isoformat() + cosmos_groups_container.upsert_item(group) + + # Log to activity logs + activity_record = { + 'id': str(uuid.uuid4()), + 'type': 'group_ownership_change', + 'activity_type': 'transfer_ownership_approved', + 'timestamp': datetime.utcnow().isoformat(), + 'requester_id': approval['requester_id'], + 'requester_email': approval['requester_email'], + 'approver_id': executor_id, + 'approver_email': executor_email, + 'group_id': group_id, + 'group_name': group.get('name', 'Unknown'), + 'old_owner_id': old_owner_id, + 'old_owner_email': old_owner.get('email'), + 'new_owner_id': new_owner_id, + 'new_owner_email': new_owner_member.get('email'), + 'approval_id': approval['id'], + 'description': f"Ownership transferred to {new_owner_member.get('email')} (approved by {executor_email})" + } + cosmos_activity_logs_container.create_item(body=activity_record) + + return { + 'success': True, + 'message': f"Ownership transferred to {new_owner_member.get('email')}" + } + + except Exception as e: + return {'success': False, 'message': f'Failed to transfer ownership: {str(e)}'} + + def _execute_transfer_workspace_ownership(approval, executor_id, executor_email, executor_name): + """Execute transfer workspace ownership action.""" + try: + workspace_id = approval.get('workspace_id') or approval.get('group_id') + new_owner_id = approval['metadata'].get('new_owner_id') + new_owner_email = approval['metadata'].get('new_owner_email') + new_owner_name = approval['metadata'].get('new_owner_name') + + if not new_owner_id: + return {'success': False, 'message': 'new_owner_id not found in approval metadata'} + + # 
Get the workspace + workspace = cosmos_public_workspaces_container.read_item(item=workspace_id, partition_key=workspace_id) + + # Get old owner info + old_owner = workspace.get('owner', {}) + if isinstance(old_owner, dict): + old_owner_id = old_owner.get('userId') + old_owner_email = old_owner.get('email') + old_owner_name = old_owner.get('displayName') + else: + # Handle case where owner is just a string (old format) + old_owner_id = old_owner + # Try to get full user info + try: + old_owner_user = cosmos_user_settings_container.read_item( + item=old_owner_id, + partition_key=old_owner_id + ) + old_owner_email = old_owner_user.get('email', 'unknown') + old_owner_name = old_owner_user.get('display_name', old_owner_email) + except: + old_owner_email = 'unknown' + old_owner_name = 'unknown' + + # Update owner with full user object + workspace['owner'] = { + 'userId': new_owner_id, + 'email': new_owner_email, + 'displayName': new_owner_name + } + + # Remove new owner from admins/documentManagers if present + new_admins = [] + for admin in workspace.get('admins', []): + admin_id = admin.get('userId') if isinstance(admin, dict) else admin + if admin_id != new_owner_id: + # Ensure admin is full object + if isinstance(admin, dict): + new_admins.append(admin) + else: + # Convert string ID to object if needed + try: + admin_user = cosmos_user_settings_container.read_item( + item=admin, + partition_key=admin + ) + new_admins.append({ + 'userId': admin, + 'email': admin_user.get('email', 'unknown'), + 'displayName': admin_user.get('display_name', 'unknown') + }) + except: + pass + workspace['admins'] = new_admins + + new_dms = [] + for dm in workspace.get('documentManagers', []): + dm_id = dm.get('userId') if isinstance(dm, dict) else dm + if dm_id != new_owner_id: + # Ensure dm is full object + if isinstance(dm, dict): + new_dms.append(dm) + else: + # Convert string ID to object if needed + try: + dm_user = cosmos_user_settings_container.read_item( + item=dm, + partition_key=dm + ) + new_dms.append({ + 'userId': dm, + 'email': dm_user.get('email', 'unknown'), + 'displayName': dm_user.get('display_name', 'unknown') + }) + except: + pass + workspace['documentManagers'] = new_dms + + # Add old owner to admins if not already there + if old_owner_id and old_owner_id != new_owner_id: + old_owner_in_admins = any( + (a.get('userId') if isinstance(a, dict) else a) == old_owner_id + for a in workspace.get('admins', []) + ) + old_owner_in_dms = any( + (dm.get('userId') if isinstance(dm, dict) else dm) == old_owner_id + for dm in workspace.get('documentManagers', []) + ) + + if not old_owner_in_admins and not old_owner_in_dms: + # Add old owner as admin + workspace.setdefault('admins', []).append({ + 'userId': old_owner_id, + 'email': old_owner_email, + 'displayName': old_owner_name + }) + + workspace['modifiedDate'] = datetime.utcnow().isoformat() + cosmos_public_workspaces_container.upsert_item(workspace) + + # Log to activity logs + activity_record = { + 'id': str(uuid.uuid4()), + 'type': 'workspace_ownership_change', + 'activity_type': 'transfer_ownership_approved', + 'timestamp': datetime.utcnow().isoformat(), + 'requester_id': approval['requester_id'], + 'requester_email': approval['requester_email'], + 'approver_id': executor_id, + 'approver_email': executor_email, + 'workspace_id': workspace_id, + 'workspace_name': workspace.get('name', 'Unknown'), + 'old_owner_id': old_owner_id, + 'old_owner_email': old_owner_email, + 'new_owner_id': new_owner_id, + 'new_owner_email': new_owner_email, + 'approval_id': 
approval['id'], + 'description': f"Ownership transferred to {new_owner_email} (approved by {executor_email})" + } + cosmos_activity_logs_container.create_item(body=activity_record) + + return { + 'success': True, + 'message': f"Ownership transferred to {new_owner_email}" + } + + except Exception as e: + return {'success': False, 'message': f'Failed to transfer workspace ownership: {str(e)}'} + + def _execute_delete_documents(approval, executor_id, executor_email, executor_name): + """Execute delete all documents action.""" + try: + group_id = approval['group_id'] + + debug_print(f"🔍 [DELETE_GROUP_DOCS] Starting deletion for group_id: {group_id}") + + # Query all document metadata for this group + query = "SELECT * FROM c WHERE c.group_id = @group_id AND c.type = 'document_metadata'" + parameters = [{"name": "@group_id", "value": group_id}] + + debug_print(f"🔍 [DELETE_GROUP_DOCS] Query: {query}") + debug_print(f"🔍 [DELETE_GROUP_DOCS] Parameters: {parameters}") + debug_print(f"🔍 [DELETE_GROUP_DOCS] Using partition_key: {group_id}") + + # Query with partition key for better performance + documents = list(cosmos_group_documents_container.query_items( + query=query, + parameters=parameters, + partition_key=group_id + )) + + debug_print(f"📊 [DELETE_GROUP_DOCS] Found {len(documents)} documents with partition key query") + + # If no documents found with partition key, try cross-partition query + if len(documents) == 0: + debug_print(f"⚠️ [DELETE_GROUP_DOCS] No documents found with partition key, trying cross-partition query") + documents = list(cosmos_group_documents_container.query_items( + query=query, + parameters=parameters, enable_cross_partition_query=True )) + debug_print(f"📊 [DELETE_GROUP_DOCS] Cross-partition query found {len(documents)} documents") - for doc in public_docs: - try: - # Create activity log directly to preserve original timestamp - activity_log = { - 'id': str(uuid.uuid4()), - 'user_id': doc.get('user_id'), - 'activity_type': 'document_creation', - 'workspace_type': 'public', - 'timestamp': doc.get('upload_date') or datetime.utcnow().isoformat(), - 'created_at': doc.get('upload_date') or datetime.utcnow().isoformat(), - 'document': { - 'document_id': doc.get('id'), - 'file_name': doc.get('file_name', 'Unknown'), - 'file_type': doc.get('file_type', 'unknown'), - 'file_size_bytes': doc.get('file_size', 0), - 'page_count': doc.get('number_of_pages', 0), - 'version': doc.get('version', 1) - }, - 'embedding_usage': { - 'total_tokens': doc.get('embedding_tokens', 0), - 'model_deployment_name': doc.get('embedding_model_deployment_name', 'unknown') - }, - 'document_metadata': { - 'author': doc.get('author'), - 'title': doc.get('title'), - 'subject': doc.get('subject'), - 'publication_date': doc.get('publication_date'), - 'keywords': doc.get('keywords', []), - 'abstract': doc.get('abstract') - }, - 'workspace_context': { - 'public_workspace_id': doc.get('public_workspace_id') - } - } - - # Save to activity logs container - cosmos_activity_logs_container.upsert_item(activity_log) - - # Add flag to document - doc['added_to_activity_log'] = True - cosmos_public_documents_container.upsert_item(doc) - - results['public_documents_migrated'] += 1 - - except Exception as doc_error: - results['public_documents_failed'] += 1 - error_msg = f"Failed to migrate public document {doc.get('id')}: {str(doc_error)}" - current_app.logger.error(error_msg) - results['errors'].append(error_msg) - - except Exception as e: - error_msg = f"Error during public documents migration: {str(e)}" - 
current_app.logger.error(error_msg) - results['errors'].append(error_msg) + # Log sample document for debugging + if len(documents) > 0: + sample_doc = documents[0] + debug_print(f"📄 [DELETE_GROUP_DOCS] Sample document structure: id={sample_doc.get('id')}, type={sample_doc.get('type')}, group_id={sample_doc.get('group_id')}") - # Calculate totals - results['total_migrated'] = ( - results['conversations_migrated'] + - results['personal_documents_migrated'] + - results['group_documents_migrated'] + - results['public_documents_migrated'] - ) + deleted_count = 0 - results['total_failed'] = ( - results['conversations_failed'] + - results['personal_documents_failed'] + - results['group_documents_failed'] + - results['public_documents_failed'] - ) + # Use proper deletion APIs for each document + for doc in documents: + try: + doc_id = doc['id'] + debug_print(f"🗑️ [DELETE_GROUP_DOCS] Deleting document {doc_id}") + + # Use delete_document API which handles: + # - Blob storage deletion + # - AI Search index deletion + # - Cosmos DB metadata deletion + # Note: For group documents, we don't have a user_id, so we pass None + delete_result = delete_document( + user_id=None, + document_id=doc_id, + group_id=group_id + ) + + # Check if delete_result is valid and successful + if delete_result and delete_result.get('success'): + # Delete document chunks using proper API + delete_document_chunks( + document_id=doc_id, + group_id=group_id + ) + + deleted_count += 1 + debug_print(f"✅ [DELETE_GROUP_DOCS] Successfully deleted document {doc_id}") + else: + error_msg = delete_result.get('message') if delete_result else 'delete_document returned None' + debug_print(f"❌ [DELETE_GROUP_DOCS] Failed to delete document {doc_id}: {error_msg}") + + except Exception as doc_error: + debug_print(f"❌ [DELETE_GROUP_DOCS] Error deleting document {doc.get('id')}: {doc_error}") - current_app.logger.info(f"Migration complete: {results['total_migrated']} migrated, {results['total_failed']} failed") + # Invalidate group search cache after deletion + try: + invalidate_group_search_cache(group_id) + debug_print(f"🔄 [DELETE_GROUP_DOCS] Invalidated search cache for group {group_id}") + except Exception as cache_error: + debug_print(f"⚠️ [DELETE_GROUP_DOCS] Could not invalidate search cache: {cache_error}") + + # Log to activity logs + activity_record = { + 'id': str(uuid.uuid4()), + 'type': 'group_documents_deletion', + 'activity_type': 'delete_all_documents_approved', + 'timestamp': datetime.utcnow().isoformat(), + 'requester_id': approval['requester_id'], + 'requester_email': approval['requester_email'], + 'approver_id': executor_id, + 'approver_email': executor_email, + 'group_id': group_id, + 'group_name': approval['group_name'], + 'documents_deleted': deleted_count, + 'approval_id': approval['id'], + 'description': f"All documents deleted from group (approved by {executor_email})" + } + cosmos_activity_logs_container.create_item(body=activity_record) - return jsonify(results), 200 + debug_print(f"[ControlCenter] Group Documents Deleted (Approved) -- group_id: {group_id}, documents_deleted: {deleted_count}") + + return { + 'success': True, + 'message': f'Deleted {deleted_count} documents' + } except Exception as e: - current_app.logger.error(f"Error during migration: {e}") - import traceback - traceback.print_exc() - return jsonify({'error': f'Migration failed: {str(e)}'}), 500 + debug_print(f"[DELETE_GROUP_DOCS] Fatal error: {e}") + return {'success': False, 'message': f'Failed to delete documents: {str(e)}'} - 
@app.route('/api/admin/control-center/activity-logs', methods=['GET']) - @swagger_route(security=get_auth_security()) - @login_required - @admin_required - @control_center_admin_required - def api_get_activity_logs(): - """ - Get paginated and filtered activity logs from cosmos_activity_logs_container. - Supports search and filtering by activity type. - """ + def _execute_delete_public_workspace_documents(approval, executor_id, executor_email, executor_name): + """Execute delete all documents in a public workspace.""" try: - # Get query parameters - page = int(request.args.get('page', 1)) - per_page = int(request.args.get('per_page', 50)) - search_term = request.args.get('search', '').strip().lower() - activity_type_filter = request.args.get('activity_type_filter', 'all').strip() + workspace_id = approval['group_id'] # workspace_id is stored as group_id - # Build query conditions - query_conditions = [] - parameters = [] + debug_print(f"🔍 [DELETE_WORKSPACE_DOCS] Starting deletion for workspace_id: {workspace_id}") - # Filter by activity type if not 'all' - if activity_type_filter and activity_type_filter != 'all': - query_conditions.append("c.activity_type = @activity_type") - parameters.append({"name": "@activity_type", "value": activity_type_filter}) + # Query all documents for this workspace + query = "SELECT c.id FROM c WHERE c.public_workspace_id = @workspace_id" + parameters = [{"name": "@workspace_id", "value": workspace_id}] - # Build WHERE clause (empty if no conditions) - where_clause = " WHERE " + " AND ".join(query_conditions) if query_conditions else "" + debug_print(f"🔍 [DELETE_WORKSPACE_DOCS] Query: {query}") + debug_print(f"🔍 [DELETE_WORKSPACE_DOCS] Parameters: {parameters}") - # Get total count for pagination - count_query = f"SELECT VALUE COUNT(1) FROM c{where_clause}" - total_items_result = list(cosmos_activity_logs_container.query_items( - query=count_query, + documents = list(cosmos_public_documents_container.query_items( + query=query, parameters=parameters, enable_cross_partition_query=True )) - total_items = total_items_result[0] if total_items_result and isinstance(total_items_result[0], int) else 0 - # Calculate pagination - offset = (page - 1) * per_page - total_pages = (total_items + per_page - 1) // per_page if total_items > 0 else 1 + debug_print(f"📊 [DELETE_WORKSPACE_DOCS] Found {len(documents)} documents") - # Get paginated results - logs_query = f""" - SELECT * FROM c{where_clause} - ORDER BY c.timestamp DESC - OFFSET {offset} LIMIT {per_page} - """ + deleted_count = 0 + for doc in documents: + try: + doc_id = doc['id'] + debug_print(f"🗑️ [DELETE_WORKSPACE_DOCS] Deleting document {doc_id}") + + # Delete document chunks and metadata using proper APIs + delete_document_chunks( + document_id=doc_id, + public_workspace_id=workspace_id + ) + + delete_document( + user_id=None, + document_id=doc_id, + public_workspace_id=workspace_id + ) + + deleted_count += 1 + debug_print(f"✅ [DELETE_WORKSPACE_DOCS] Successfully deleted document {doc_id}") + + except Exception as doc_error: + debug_print(f"❌ [DELETE_WORKSPACE_DOCS] Error deleting document {doc_id}: {doc_error}") + + # Log to activity logs + activity_record = { + 'id': str(uuid.uuid4()), + 'type': 'public_workspace_documents_deletion', + 'activity_type': 'delete_all_documents_approved', + 'timestamp': datetime.utcnow().isoformat(), + 'requester_id': approval['requester_id'], + 'requester_email': approval['requester_email'], + 'approver_id': executor_id, + 'approver_email': executor_email, + 'workspace_id': 
workspace_id, + 'workspace_name': approval.get('metadata', {}).get('workspace_name', 'Unknown'), + 'documents_deleted': deleted_count, + 'approval_id': approval['id'], + 'description': f"All documents deleted from public workspace (approved by {executor_email})", + 'workspace_context': { + 'public_workspace_id': workspace_id + } + } + cosmos_activity_logs_container.create_item(body=activity_record) - current_app.logger.info(f"Activity logs query: {logs_query}") - current_app.logger.info(f"Query parameters: {parameters}") + debug_print(f"[ControlCenter] Public Workspace Documents Deleted (Approved) -- workspace_id: {workspace_id}, documents_deleted: {deleted_count}") - logs = list(cosmos_activity_logs_container.query_items( - query=logs_query, + return { + 'success': True, + 'message': f'Deleted {deleted_count} documents from public workspace' + } + + except Exception as e: + debug_print(f"[DELETE_WORKSPACE_DOCS] Fatal error: {e}") + return {'success': False, 'message': f'Failed to delete workspace documents: {str(e)}'} + + def _execute_delete_public_workspace(approval, executor_id, executor_email, executor_name): + """Execute delete entire public workspace action.""" + try: + workspace_id = approval['group_id'] # workspace_id is stored as group_id + + debug_print(f"🔍 [DELETE_WORKSPACE] Starting deletion for workspace_id: {workspace_id}") + + # First delete all documents + doc_result = _execute_delete_public_workspace_documents(approval, executor_id, executor_email, executor_name) + + if not doc_result['success']: + return doc_result + + # Delete the workspace itself + try: + cosmos_public_workspaces_container.delete_item( + item=workspace_id, + partition_key=workspace_id + ) + debug_print(f"✅ [DELETE_WORKSPACE] Successfully deleted workspace {workspace_id}") + except Exception as del_e: + debug_print(f"❌ [DELETE_WORKSPACE] Error deleting workspace {workspace_id}: {del_e}") + return {'success': False, 'message': f'Failed to delete workspace: {str(del_e)}'} + + # Log to activity logs + activity_record = { + 'id': str(uuid.uuid4()), + 'type': 'public_workspace_deletion', + 'activity_type': 'delete_workspace_approved', + 'timestamp': datetime.utcnow().isoformat(), + 'requester_id': approval['requester_id'], + 'requester_email': approval['requester_email'], + 'approver_id': executor_id, + 'approver_email': executor_email, + 'workspace_id': workspace_id, + 'workspace_name': approval.get('metadata', {}).get('workspace_name', 'Unknown'), + 'approval_id': approval['id'], + 'description': f"Public workspace completely deleted (approved by {executor_email})", + 'workspace_context': { + 'public_workspace_id': workspace_id + } + } + cosmos_activity_logs_container.create_item(body=activity_record) + + debug_print(f"[ControlCenter] Public Workspace Deleted (Approved) -- workspace_id: {workspace_id}") + + return { + 'success': True, + 'message': 'Public workspace and all documents deleted successfully' + } + + except Exception as e: + debug_print(f"[DELETE_WORKSPACE] Fatal error: {e}") + return {'success': False, 'message': f'Failed to delete workspace: {str(e)}'} + + def _execute_delete_group(approval, executor_id, executor_email, executor_name): + """Execute delete entire group action.""" + try: + group_id = approval['group_id'] + + # First delete all documents + doc_result = _execute_delete_documents(approval, executor_id, executor_email, executor_name) + + # Delete group conversations (optional - could keep for audit) + try: + query = "SELECT * FROM c WHERE c.group_id = @group_id" + parameters = 
[{"name": "@group_id", "value": group_id}] + + conversations = list(cosmos_group_conversations_container.query_items( + query=query, + parameters=parameters, + enable_cross_partition_query=True + )) + + for conv in conversations: + cosmos_group_conversations_container.delete_item( + item=conv['id'], + partition_key=group_id + ) + except Exception as conv_error: + debug_print(f"Error deleting conversations: {conv_error}") + + # Delete group messages (optional) + try: + messages = list(cosmos_group_messages_container.query_items( + query=query, + parameters=parameters, + enable_cross_partition_query=True + )) + + for msg in messages: + cosmos_group_messages_container.delete_item( + item=msg['id'], + partition_key=group_id + ) + except Exception as msg_error: + debug_print(f"Error deleting messages: {msg_error}") + + # Finally, delete the group itself using proper API + debug_print(f"🗑️ [DELETE GROUP] Deleting group document using delete_group() API") + delete_group(group_id) + debug_print(f"✅ [DELETE GROUP] Group {group_id} successfully deleted") + + # Log to activity logs + activity_record = { + 'id': str(uuid.uuid4()), + 'type': 'group_deletion', + 'activity_type': 'delete_group_approved', + 'timestamp': datetime.utcnow().isoformat(), + 'requester_id': approval['requester_id'], + 'requester_email': approval['requester_email'], + 'approver_id': executor_id, + 'approver_email': executor_email, + 'group_id': group_id, + 'group_name': approval['group_name'], + 'approval_id': approval['id'], + 'description': f"Group completely deleted (approved by {executor_email})" + } + cosmos_activity_logs_container.create_item(body=activity_record) + + return { + 'success': True, + 'message': 'Group completely deleted' + } + + except Exception as e: + return {'success': False, 'message': f'Failed to delete group: {str(e)}'} + + def _execute_delete_user_documents(approval, executor_id, executor_email, executor_name): + """Execute delete all user documents action.""" + try: + from functions_documents import delete_document, delete_document_chunks + from utils_cache import invalidate_personal_search_cache + + user_id = approval['metadata'].get('user_id') + user_email = approval['metadata'].get('user_email', 'unknown') + user_name = approval['metadata'].get('user_name', user_email) + + if not user_id: + return {'success': False, 'message': 'User ID not found in approval metadata'} + + # Query all personal documents for this user + # Personal documents are stored in cosmos_user_documents_container with user_id as partition key + query = "SELECT * FROM c WHERE c.user_id = @user_id" + parameters = [{"name": "@user_id", "value": user_id}] + + debug_print(f"🔍 [DELETE_USER_DOCS] Querying for user_id: {user_id}") + debug_print(f"🔍 [DELETE_USER_DOCS] Query: {query}") + debug_print(f"🔍 [DELETE_USER_DOCS] Container: cosmos_user_documents_container") + + documents = list(cosmos_user_documents_container.query_items( + query=query, parameters=parameters, - enable_cross_partition_query=True + partition_key=user_id # Use partition key for efficient query )) - # Apply search filter in Python (after fetching from Cosmos) - if search_term: - filtered_logs = [] - for log in logs: - # Search in various fields - searchable_text = ' '.join([ - str(log.get('activity_type', '')), - str(log.get('user_id', '')), - str(log.get('login_method', '')), - str(log.get('conversation', {}).get('title', '')), - str(log.get('document', {}).get('file_name', '')), - str(log.get('token_type', '')), - str(log.get('workspace_type', '')) - ]).lower() + 
debug_print(f"📊 [DELETE_USER_DOCS] Found {len(documents)} documents with partition key query") + if len(documents) > 0: + debug_print(f"📄 [DELETE_USER_DOCS] First document sample: id={documents[0].get('id', 'no-id')}, file_name={documents[0].get('file_name', 'no-filename')}, type={documents[0].get('type', 'no-type')}") + else: + # Try a cross-partition query to see if documents exist elsewhere + debug_print(f"⚠️ [DELETE_USER_DOCS] No documents found with partition key, trying cross-partition query...") + documents = list(cosmos_user_documents_container.query_items( + query=query, + parameters=parameters, + enable_cross_partition_query=True + )) + debug_print(f"📊 [DELETE_USER_DOCS] Cross-partition query found {len(documents)} documents") + if len(documents) > 0: + sample_doc = documents[0] + debug_print(f"📄 [DELETE_USER_DOCS] Sample doc fields: {list(sample_doc.keys())}") + debug_print(f"📄 [DELETE_USER_DOCS] Sample doc: id={sample_doc.get('id')}, type={sample_doc.get('type')}, user_id={sample_doc.get('user_id')}, file_name={sample_doc.get('file_name')}") + + deleted_count = 0 + + # Use the existing delete_document function for proper cleanup + for doc in documents: + try: + document_id = doc['id'] + debug_print(f"🗑️ [DELETE_USER_DOCS] Deleting document {document_id}: {doc.get('file_name', 'unknown')}") - if search_term in searchable_text: - filtered_logs.append(log) - - logs = filtered_logs - # Recalculate total_items for filtered results - total_items = len(logs) - total_pages = (total_items + per_page - 1) // per_page if total_items > 0 else 1 + # Use the proper delete_document function which handles: + # - Blob storage deletion + # - AI Search index deletion + # - Cosmos DB document deletion + delete_document(user_id, document_id) + delete_document_chunks(document_id) + + deleted_count += 1 + debug_print(f"✅ [DELETE_USER_DOCS] Successfully deleted document {document_id}") + + except Exception as doc_error: + debug_print(f"❌ [DELETE_USER_DOCS] Error deleting document {doc.get('id')}: {doc_error}") - # Get unique user IDs from logs - user_ids = set(log.get('user_id') for log in logs if log.get('user_id')) + # Invalidate search cache for this user + try: + invalidate_personal_search_cache(user_id) + debug_print(f"🔄 [DELETE_USER_DOCS] Invalidated search cache for user {user_id}") + except Exception as cache_error: + debug_print(f"⚠️ [DELETE_USER_DOCS] Failed to invalidate search cache: {cache_error}") + + # Log to activity logs + activity_record = { + 'id': str(uuid.uuid4()), + 'type': 'user_documents_deletion', + 'activity_type': 'delete_all_user_documents_approved', + 'timestamp': datetime.utcnow().isoformat(), + 'requester_id': approval['requester_id'], + 'requester_email': approval['requester_email'], + 'approver_id': executor_id, + 'approver_email': executor_email, + 'target_user_id': user_id, + 'target_user_email': user_email, + 'target_user_name': user_name, + 'documents_deleted': deleted_count, + 'approval_id': approval['id'], + 'description': f"All documents deleted for user {user_name} ({user_email}) - approved by {executor_email}" + } + cosmos_activity_logs_container.create_item(body=activity_record) - # Fetch user information for display names/emails - user_map = {} - if user_ids: - for user_id in user_ids: - try: - user_doc = cosmos_user_settings_container.read_item( - item=user_id, - partition_key=user_id - ) - user_map[user_id] = { - 'email': user_doc.get('email', ''), - 'display_name': user_doc.get('display_name', '') - } - except: - user_map[user_id] = { - 'email': '', - 
'display_name': '' - } + # Log to AppInsights + log_event("[ControlCenter] User Documents Deleted (Approved)", { + "executor": executor_email, + "user_id": user_id, + "user_email": user_email, + "documents_deleted": deleted_count, + "approval_id": approval['id'] + }) - return jsonify({ - 'logs': logs, - 'user_map': user_map, - 'pagination': { - 'page': page, - 'per_page': per_page, - 'total_items': total_items, - 'total_pages': total_pages, - 'has_prev': page > 1, - 'has_next': page < total_pages - } - }), 200 + return { + 'success': True, + 'message': f'Deleted {deleted_count} documents for user {user_name}' + } except Exception as e: - current_app.logger.error(f"Error getting activity logs: {e}") - import traceback - traceback.print_exc() + debug_print(f"Error deleting user documents: {e}") + return {'success': False, 'message': f'Failed to delete user documents: {str(e)}'} + return jsonify({'error': 'Failed to retrieve activity logs'}), 500 \ No newline at end of file diff --git a/application/single_app/route_backend_documents.py b/application/single_app/route_backend_documents.py index 74846793..072577d6 100644 --- a/application/single_app/route_backend_documents.py +++ b/application/single_app/route_backend_documents.py @@ -6,7 +6,7 @@ from functions_settings import * from utils_cache import invalidate_personal_search_cache from functions_debug import * -from functions_activity_logging import log_document_upload +from functions_activity_logging import log_document_upload, log_document_metadata_update_transaction import os import requests from flask import current_app @@ -24,6 +24,7 @@ def get_file_content(): user_id = get_current_user_id() conversation_id = data.get('conversation_id') file_id = data.get('file_id') + debug_print(f"[GET_FILE_CONTENT] Starting - user_id={user_id}, conversation_id={conversation_id}, file_id={file_id}") if not user_id: @@ -458,6 +459,9 @@ def api_patch_user_document(document_id): return jsonify({'error': 'User not authenticated'}), 401 data = request.get_json() # new metadata values from the client + + # Track which fields were updated + updated_fields = {} # Update allowed fields # You can decide which fields can be updated from the client @@ -467,12 +471,14 @@ def api_patch_user_document(document_id): user_id=user_id, title=data['title'] ) + updated_fields['title'] = data['title'] if 'abstract' in data: update_document( document_id=document_id, user_id=user_id, abstract=data['abstract'] ) + updated_fields['abstract'] = data['abstract'] if 'keywords' in data: # Expect a list or a comma-delimited string if isinstance(data['keywords'], list): @@ -481,25 +487,30 @@ def api_patch_user_document(document_id): user_id=user_id, keywords=data['keywords'] ) + updated_fields['keywords'] = data['keywords'] else: # if client sends a comma-separated string of keywords + keywords_list = [kw.strip() for kw in data['keywords'].split(',')] update_document( document_id=document_id, user_id=user_id, - keywords=[kw.strip() for kw in data['keywords'].split(',')] + keywords=keywords_list ) + updated_fields['keywords'] = keywords_list if 'publication_date' in data: update_document( document_id=document_id, user_id=user_id, publication_date=data['publication_date'] ) + updated_fields['publication_date'] = data['publication_date'] if 'document_classification' in data: update_document( document_id=document_id, user_id=user_id, document_classification=data['document_classification'] ) + updated_fields['document_classification'] = data['document_classification'] # Add authors if you 
want to allow editing that if 'authors' in data: # if you want a list, or just store a string @@ -510,15 +521,32 @@ def api_patch_user_document(document_id): user_id=user_id, authors=data['authors'] ) + updated_fields['authors'] = data['authors'] else: + authors_list = [data['authors']] update_document( document_id=document_id, user_id=user_id, - authors=[data['authors']] + authors=authors_list ) + updated_fields['authors'] = authors_list # Save updates back to Cosmos try: + # Log the metadata update transaction if any fields were updated + if updated_fields: + # Get document details for logging + doc = get_document(user_id, document_id) + if doc: + log_document_metadata_update_transaction( + user_id=user_id, + document_id=document_id, + workspace_type='personal', + file_name=doc.get('file_name', 'Unknown'), + updated_fields=updated_fields, + file_type=doc.get('file_type') + ) + return jsonify({'message': 'Document metadata updated successfully'}), 200 except Exception as e: return jsonify({'error': str(e)}), 500 diff --git a/application/single_app/route_backend_feedback.py b/application/single_app/route_backend_feedback.py index bf526f60..49167cc8 100644 --- a/application/single_app/route_backend_feedback.py +++ b/application/single_app/route_backend_feedback.py @@ -141,7 +141,7 @@ def feedback_submit(): @app.route("/feedback/review", methods=["GET"]) @swagger_route(security=get_auth_security()) @login_required - @admin_required + @feedback_admin_required @enabled_required("enable_user_feedback") def feedback_review_get(): """ @@ -247,7 +247,7 @@ def feedback_review_get(): @app.route("/feedback/review/", methods=["GET"]) @swagger_route(security=get_auth_security()) @login_required - @admin_required + @feedback_admin_required @enabled_required("enable_user_feedback") def feedback_review_get_single(feedbackId): """ @@ -283,7 +283,7 @@ def feedback_review_get_single(feedbackId): @app.route("/feedback/review/", methods=["PATCH"]) @swagger_route(security=get_auth_security()) @login_required - @admin_required + @feedback_admin_required @enabled_required("enable_user_feedback") def feedback_review_update(feedbackId): """ @@ -328,7 +328,7 @@ def feedback_review_update(feedbackId): @app.route("/feedback/retest/", methods=["POST"]) @swagger_route(security=get_auth_security()) @login_required - @admin_required + @feedback_admin_required @enabled_required("enable_user_feedback") def feedback_retest(feedbackId): """ diff --git a/application/single_app/route_backend_group_documents.py b/application/single_app/route_backend_group_documents.py index 805cf3c2..194b5a6b 100644 --- a/application/single_app/route_backend_group_documents.py +++ b/application/single_app/route_backend_group_documents.py @@ -42,6 +42,12 @@ def api_upload_group_document(): if not group_doc: return jsonify({'error': 'Active group not found'}), 404 + # Check if group status allows uploads + from functions_group import check_group_status_allows_operation + allowed, reason = check_group_status_allows_operation(group_doc, 'upload') + if not allowed: + return jsonify({'error': reason}), 403 + role = get_user_role_in_group(group_doc, user_id) if role not in ["Owner", "Admin", "DocumentManager"]: return jsonify({'error': 'You do not have permission to upload documents'}), 403 @@ -336,6 +342,9 @@ def api_patch_group_document(document_id): return jsonify({'error': 'You do not have permission to update documents in this group'}), 403 data = request.get_json() + + # Track which fields were updated + updated_fields = {} try: if 'title' in 
data: @@ -345,6 +354,7 @@ def api_patch_group_document(document_id): user_id=user_id, title=data['title'] ) + updated_fields['title'] = data['title'] if 'abstract' in data: update_document( document_id=document_id, @@ -352,6 +362,7 @@ def api_patch_group_document(document_id): user_id=user_id, abstract=data['abstract'] ) + updated_fields['abstract'] = data['abstract'] if 'keywords' in data: if isinstance(data['keywords'], list): update_document( @@ -360,13 +371,16 @@ def api_patch_group_document(document_id): user_id=user_id, keywords=data['keywords'] ) + updated_fields['keywords'] = data['keywords'] else: + keywords_list = [kw.strip() for kw in data['keywords'].split(',')] update_document( document_id=document_id, group_id=active_group_id, user_id=user_id, - keywords=[kw.strip() for kw in data['keywords'].split(',')] + keywords=keywords_list ) + updated_fields['keywords'] = keywords_list if 'publication_date' in data: update_document( document_id=document_id, @@ -374,6 +388,7 @@ def api_patch_group_document(document_id): user_id=user_id, publication_date=data['publication_date'] ) + updated_fields['publication_date'] = data['publication_date'] if 'document_classification' in data: update_document( document_id=document_id, @@ -381,6 +396,7 @@ def api_patch_group_document(document_id): user_id=user_id, document_classification=data['document_classification'] ) + updated_fields['document_classification'] = data['document_classification'] if 'authors' in data: if isinstance(data['authors'], list): update_document( @@ -389,12 +405,32 @@ def api_patch_group_document(document_id): user_id=user_id, authors=data['authors'] ) + updated_fields['authors'] = data['authors'] else: + authors_list = [data['authors']] update_document( document_id=document_id, group_id=active_group_id, user_id=user_id, - authors=[data['authors']] + authors=authors_list + ) + updated_fields['authors'] = authors_list + + # Log the metadata update transaction if any fields were updated + if updated_fields: + # Get document details for logging + from functions_documents import get_document + doc = get_document(user_id, document_id, group_id=active_group_id) + if doc: + from functions_activity_logging import log_document_metadata_update_transaction + log_document_metadata_update_transaction( + user_id=user_id, + document_id=document_id, + workspace_type='group', + file_name=doc.get('file_name', 'Unknown'), + updated_fields=updated_fields, + file_type=doc.get('file_type'), + group_id=active_group_id ) return jsonify({'message': 'Group document metadata updated successfully'}), 200 @@ -425,6 +461,12 @@ def api_delete_group_document(document_id): if not group_doc: return jsonify({'error': 'Active group not found'}), 404 + # Check if group status allows deletions + from functions_group import check_group_status_allows_operation + allowed, reason = check_group_status_allows_operation(group_doc, 'delete') + if not allowed: + return jsonify({'error': reason}), 403 + role = get_user_role_in_group(group_doc, user_id) if role not in ["Owner", "Admin", "DocumentManager"]: return jsonify({'error': 'You do not have permission to delete documents in this group'}), 403 diff --git a/application/single_app/route_backend_groups.py b/application/single_app/route_backend_groups.py index b23ad7fb..0e35d211 100644 --- a/application/single_app/route_backend_groups.py +++ b/application/single_app/route_backend_groups.py @@ -3,6 +3,8 @@ from config import * from functions_authentication import * from functions_group import * +from functions_debug import 
debug_print +from functions_notifications import create_notification from swagger_wrapper import swagger_route, get_auth_security def register_route_backend_groups(app): @@ -112,7 +114,8 @@ def api_list_groups(): "name": g.get("name", "Untitled Group"), # Provide default name "description": g.get("description", ""), "userRole": role, - "isActive": (g["id"] == db_active_group_id) + "isActive": (g["id"] == db_active_group_id), + "status": g.get("status", "active") # Include group status }) return jsonify({ @@ -384,6 +387,7 @@ def add_member_directly(group_id): """ user_info = get_current_user_info() user_id = user_info["userId"] + user_email = user_info.get("email", "unknown") group_doc = find_group_by_id(group_id) @@ -402,16 +406,80 @@ def add_member_directly(group_id): if get_user_role_in_group(group_doc, new_user_id): return jsonify({"error": "User is already a member"}), 400 + # Get role from request, default to 'user' + member_role = data.get("role", "user").lower() + + # Validate role + valid_roles = ['admin', 'document_manager', 'user'] + if member_role not in valid_roles: + return jsonify({"error": f"Invalid role. Must be: {', '.join(valid_roles)}"}), 400 + new_member_doc = { "userId": new_user_id, "email": data.get("email", ""), "displayName": data.get("displayName", "New User") } group_doc["users"].append(new_member_doc) + + # Add to appropriate role array + if member_role == 'admin': + if new_user_id not in group_doc.get('admins', []): + group_doc.setdefault('admins', []).append(new_user_id) + elif member_role == 'document_manager': + if new_user_id not in group_doc.get('documentManagers', []): + group_doc.setdefault('documentManagers', []).append(new_user_id) + group_doc["modifiedDate"] = datetime.utcnow().isoformat() cosmos_groups_container.upsert_item(group_doc) - return jsonify({"message": "Member added"}), 200 + + # Log activity for member addition + try: + activity_record = { + 'id': str(uuid.uuid4()), + 'activity_type': 'add_member_directly', + 'timestamp': datetime.utcnow().isoformat(), + 'added_by_user_id': user_id, + 'added_by_email': user_email, + 'added_by_role': role, + 'group_id': group_id, + 'group_name': group_doc.get('name', 'Unknown'), + 'member_user_id': new_user_id, + 'member_email': new_member_doc.get('email', ''), + 'member_name': new_member_doc.get('displayName', ''), + 'member_role': member_role, + 'description': f"{role} {user_email} added member {new_member_doc.get('displayName', '')} ({new_member_doc.get('email', '')}) to group {group_doc.get('name', group_id)} as {member_role}" + } + cosmos_activity_logs_container.create_item(body=activity_record) + except Exception as log_error: + debug_print(f"Failed to log member addition activity: {log_error}") + + # Create notification for the new member + try: + from functions_notifications import create_notification + role_display = { + 'admin': 'Admin', + 'document_manager': 'Document Manager', + 'user': 'Member' + }.get(member_role, 'Member') + + create_notification( + user_id=new_user_id, + notification_type='system_announcement', + title='Added to Group', + message=f"You have been added to the group '{group_doc.get('name', 'Unknown')}' as {role_display} by {user_email}.", + link_url=f"/manage_group/{group_id}", + metadata={ + 'group_id': group_id, + 'group_name': group_doc.get('name', 'Unknown'), + 'added_by': user_email, + 'role': member_role + } + ) + except Exception as notif_error: + debug_print(f"Failed to create member addition notification: {notif_error}") + + return jsonify({"message": "Member added", 
"success": True}), 200 @app.route("/api/groups//members/", methods=["DELETE"]) @swagger_route(security=get_auth_security()) @@ -439,10 +507,12 @@ def remove_member(group_id, member_id): "Transfer ownership or delete the group."}), 403 removed = False + removed_member_info = None updated_users = [] for u in group_doc["users"]: if u["userId"] == member_id: removed = True + removed_member_info = u continue updated_users.append(u) @@ -457,6 +527,26 @@ def remove_member(group_id, member_id): cosmos_groups_container.upsert_item(group_doc) if removed: + # Log activity for self-removal + from functions_activity_logging import log_group_member_deleted + user_email = user_info.get("email", "unknown") + member_name = removed_member_info.get('displayName', '') if removed_member_info else '' + member_email = removed_member_info.get('email', '') if removed_member_info else '' + description = f"Member {user_email} left group {group_doc.get('name', group_id)}" + + log_group_member_deleted( + removed_by_user_id=user_id, + removed_by_email=user_email, + removed_by_role='Member', + member_user_id=member_id, + member_email=member_email, + member_name=member_name, + group_id=group_id, + group_name=group_doc.get('name', 'Unknown'), + action='member_left_group', + description=description + ) + return jsonify({"message": "You have left the group"}), 200 else: return jsonify({"error": "You are not in this group"}), 404 @@ -470,10 +560,12 @@ def remove_member(group_id, member_id): return jsonify({"error": "Cannot remove the group owner"}), 403 removed = False + removed_member_info = None updated_users = [] for u in group_doc["users"]: if u["userId"] == member_id: removed = True + removed_member_info = u continue updated_users.append(u) group_doc["users"] = updated_users @@ -487,6 +579,26 @@ def remove_member(group_id, member_id): cosmos_groups_container.upsert_item(group_doc) if removed: + # Log activity for admin/owner removal + from functions_activity_logging import log_group_member_deleted + user_email = user_info.get("email", "unknown") + member_name = removed_member_info.get('displayName', '') if removed_member_info else '' + member_email = removed_member_info.get('email', '') if removed_member_info else '' + description = f"{role} {user_email} removed member {member_name} ({member_email}) from group {group_doc.get('name', group_id)}" + + log_group_member_deleted( + removed_by_user_id=user_id, + removed_by_email=user_email, + removed_by_role=role, + member_user_id=member_id, + member_email=member_email, + member_name=member_name, + group_id=group_id, + group_name=group_doc.get('name', 'Unknown'), + action='admin_removed_member', + description=description + ) + return jsonify({"message": "User removed"}), 200 else: return jsonify({"error": "User not found in group"}), 404 @@ -505,6 +617,7 @@ def update_member_role(group_id, member_id): """ user_info = get_current_user_info() user_id = user_info["userId"] + user_email = user_info.get("email", "unknown") group_doc = find_group_by_id(group_id) @@ -524,6 +637,15 @@ def update_member_role(group_id, member_id): if not target_role: return jsonify({"error": "Member is not in the group"}), 404 + # Get member details for logging + member_name = "Unknown" + member_email = "unknown" + for u in group_doc.get("users", []): + if u.get("userId") == member_id: + member_name = u.get("displayName", "Unknown") + member_email = u.get("email", "unknown") + break + if member_id in group_doc.get("admins", []): group_doc["admins"].remove(member_id) if member_id in 
group_doc.get("documentManagers", []): @@ -539,6 +661,49 @@ def update_member_role(group_id, member_id): group_doc["modifiedDate"] = datetime.utcnow().isoformat() cosmos_groups_container.upsert_item(group_doc) + # Log activity for role change + try: + activity_record = { + 'id': str(uuid.uuid4()), + 'type': 'group_member_role_changed', + 'activity_type': 'update_member_role', + 'timestamp': datetime.utcnow().isoformat(), + 'changed_by_user_id': user_id, + 'changed_by_email': user_email, + 'changed_by_role': current_role, + 'group_id': group_id, + 'group_name': group_doc.get('name', 'Unknown'), + 'member_user_id': member_id, + 'member_email': member_email, + 'member_name': member_name, + 'old_role': target_role, + 'new_role': new_role, + 'description': f"{current_role} {user_email} changed {member_name} ({member_email}) role from {target_role} to {new_role} in group {group_doc.get('name', group_id)}" + } + cosmos_activity_logs_container.create_item(body=activity_record) + except Exception as log_error: + debug_print(f"Failed to log role change activity: {log_error}") + + # Create notification for the member whose role was changed + try: + from functions_notifications import create_notification + create_notification( + user_id=member_id, + notification_type='system_announcement', + title='Role Changed', + message=f"Your role in group '{group_doc.get('name', 'Unknown')}' has been changed from {target_role} to {new_role} by {user_email}.", + link_url=f"/manage_group/{group_id}", + metadata={ + 'group_id': group_id, + 'group_name': group_doc.get('name', 'Unknown'), + 'changed_by': user_email, + 'old_role': target_role, + 'new_role': new_role + } + ) + except Exception as notif_error: + debug_print(f"Failed to create role change notification: {notif_error}") + return jsonify({"message": f"User {member_id} updated to {new_role}"}), 200 @app.route("/api/groups//members", methods=["GET"]) @@ -705,3 +870,260 @@ def get_group_file_count(group_id): file_count = item return jsonify({ "fileCount": file_count }), 200 + + @app.route("/api/groups//activity", methods=["GET"]) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + @enabled_required("enable_group_workspaces") + def api_group_activity(group_id): + """ + GET /api/groups//activity + Returns recent activity timeline for the group. + Only accessible by owner and admins. 
+ """ + from functions_debug import debug_print + + info = get_current_user_info() + user_id = info["userId"] + + group = find_group_by_id(group_id) + if not group: + return jsonify({"error": "Not found"}), 404 + + # Check user is owner or admin (NOT document managers or regular members) + is_owner = group["owner"]["id"] == user_id + is_admin = user_id in (group.get("admins", [])) + + if not (is_owner or is_admin): + return jsonify({"error": "Forbidden - Only group owners and admins can view activity timeline"}), 403 + + # Get pagination parameters + limit = request.args.get('limit', 50, type=int) + if limit not in [10, 20, 50]: + limit = 50 + + # Get recent activity + query = f""" + SELECT TOP {limit} * + FROM a + WHERE a.workspace_context.group_id = @groupId + ORDER BY a.timestamp DESC + """ + params = [{"name": "@groupId", "value": group_id}] + + debug_print(f"[GROUP_ACTIVITY] Group ID: {group_id}") + debug_print(f"[GROUP_ACTIVITY] Query: {query}") + debug_print(f"[GROUP_ACTIVITY] Params: {params}") + + activities = [] + try: + activity_iter = cosmos_activity_logs_container.query_items( + query=query, + parameters=params, + enable_cross_partition_query=True + ) + activities = list(activity_iter) + debug_print(f"[GROUP_ACTIVITY] Found {len(activities)} activity records") + except Exception as e: + debug_print(f"[GROUP_ACTIVITY] Error querying activity: {e}") + return jsonify({"error": "Failed to retrieve activity"}), 500 + + return jsonify(activities), 200 + + @app.route("/api/groups//stats", methods=["GET"]) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + @enabled_required("enable_group_workspaces") + def api_group_stats(group_id): + """ + GET /api/groups//stats + Returns statistics for the group including documents, storage, tokens, and members. + Only accessible by owner and admins. 
+ """ + from functions_debug import debug_print + from datetime import datetime, timedelta + + info = get_current_user_info() + user_id = info["userId"] + + group = find_group_by_id(group_id) + if not group: + return jsonify({"error": "Not found"}), 404 + + # Check user is owner or admin + is_owner = group["owner"]["id"] == user_id + is_admin = user_id in (group.get("admins", [])) + + if not (is_owner or is_admin): + return jsonify({"error": "Forbidden"}), 403 + + # Get metrics from group record + metrics = group.get("metrics", {}) + document_metrics = metrics.get("document_metrics", {}) + + total_documents = document_metrics.get("total_documents", 0) + storage_used = document_metrics.get("storage_account_size", 0) + ai_search_size = document_metrics.get("ai_search_size", 0) + storage_account_size = document_metrics.get("storage_account_size", 0) + + # Get member count + total_members = len(group.get("users", [])) + + # Get token usage from activity logs (last 30 days) + thirty_days_ago = (datetime.utcnow() - timedelta(days=30)).isoformat() + + debug_print(f"[GROUP_STATS] Group ID: {group_id}") + debug_print(f"[GROUP_STATS] Start date: {thirty_days_ago}") + + token_query = """ + SELECT a.usage + FROM a + WHERE a.workspace_context.group_id = @groupId + AND a.timestamp >= @startDate + AND a.activity_type = 'token_usage' + """ + token_params = [ + {"name": "@groupId", "value": group_id}, + {"name": "@startDate", "value": thirty_days_ago} + ] + + total_tokens = 0 + try: + token_iter = cosmos_activity_logs_container.query_items( + query=token_query, + parameters=token_params, + enable_cross_partition_query=True + ) + for item in token_iter: + usage = item.get("usage", {}) + total_tokens += usage.get("total_tokens", 0) + debug_print(f"[GROUP_STATS] Total tokens accumulated: {total_tokens}") + except Exception as e: + debug_print(f"[GROUP_STATS] Error querying total tokens: {e}") + + # Get activity data for charts (last 30 days) + doc_activity_labels = [] + doc_upload_data = [] + doc_delete_data = [] + token_usage_labels = [] + token_usage_data = [] + + # Generate labels for last 30 days + for i in range(29, -1, -1): + date = datetime.utcnow() - timedelta(days=i) + doc_activity_labels.append(date.strftime("%m/%d")) + token_usage_labels.append(date.strftime("%m/%d")) + doc_upload_data.append(0) + doc_delete_data.append(0) + token_usage_data.append(0) + + # Get document upload activity by day + doc_upload_query = """ + SELECT a.timestamp, a.created_at + FROM a + WHERE a.workspace_context.group_id = @groupId + AND a.timestamp >= @startDate + AND a.activity_type = 'document_creation' + """ + try: + activity_iter = cosmos_activity_logs_container.query_items( + query=doc_upload_query, + parameters=token_params, + enable_cross_partition_query=True + ) + for item in activity_iter: + timestamp = item.get("timestamp") or item.get("created_at") + if timestamp: + try: + dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00')) + day_date = dt.strftime("%m/%d") + if day_date in doc_activity_labels: + idx = doc_activity_labels.index(day_date) + doc_upload_data[idx] += 1 + except Exception as e: + debug_print(f"[GROUP_STATS] Error parsing timestamp: {e}") + except Exception as e: + debug_print(f"[GROUP_STATS] Error querying document uploads: {e}") + + # Get document delete activity by day + doc_delete_query = """ + SELECT a.timestamp, a.created_at + FROM a + WHERE a.workspace_context.group_id = @groupId + AND a.timestamp >= @startDate + AND a.activity_type = 'document_deletion' + """ + try: + delete_iter = 
cosmos_activity_logs_container.query_items( + query=doc_delete_query, + parameters=token_params, + enable_cross_partition_query=True + ) + for item in delete_iter: + timestamp = item.get("timestamp") or item.get("created_at") + if timestamp: + try: + dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00')) + day_date = dt.strftime("%m/%d") + if day_date in doc_activity_labels: + idx = doc_activity_labels.index(day_date) + doc_delete_data[idx] += 1 + except Exception as e: + debug_print(f"[GROUP_STATS] Error parsing timestamp: {e}") + except Exception as e: + debug_print(f"[GROUP_STATS] Error querying document deletes: {e}") + + # Get token usage by day + token_activity_query = """ + SELECT a.timestamp, a.created_at, a.usage + FROM a + WHERE a.workspace_context.group_id = @groupId + AND a.timestamp >= @startDate + AND a.activity_type = 'token_usage' + """ + try: + token_activity_iter = cosmos_activity_logs_container.query_items( + query=token_activity_query, + parameters=token_params, + enable_cross_partition_query=True + ) + for item in token_activity_iter: + timestamp = item.get("timestamp") or item.get("created_at") + if timestamp: + try: + dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00')) + day_date = dt.strftime("%m/%d") + if day_date in token_usage_labels: + idx = token_usage_labels.index(day_date) + usage = item.get("usage", {}) + tokens = usage.get("total_tokens", 0) + token_usage_data[idx] += tokens + except Exception as e: + debug_print(f"[GROUP_STATS] Error parsing timestamp: {e}") + except Exception as e: + debug_print(f"[GROUP_STATS] Error querying token usage: {e}") + + stats = { + "totalDocuments": total_documents, + "storageUsed": storage_used, + "storageLimit": 10737418240, # 10GB default + "totalTokens": total_tokens, + "totalMembers": total_members, + "storage": { + "ai_search_size": ai_search_size, + "storage_account_size": storage_account_size + }, + "documentActivity": { + "labels": doc_activity_labels, + "uploads": doc_upload_data, + "deletes": doc_delete_data + }, + "tokenUsage": { + "labels": token_usage_labels, + "data": token_usage_data + } + } + + return jsonify(stats), 200 diff --git a/application/single_app/route_backend_models.py b/application/single_app/route_backend_models.py index e0859453..176d112c 100644 --- a/application/single_app/route_backend_models.py +++ b/application/single_app/route_backend_models.py @@ -13,9 +13,7 @@ def register_route_backend_models(app): """ @app.route('/api/models/gpt', methods=['GET']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def get_gpt_models(): @@ -75,9 +73,7 @@ def get_gpt_models(): @app.route('/api/models/embedding', methods=['GET']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def get_embedding_models(): @@ -135,9 +131,7 @@ def get_embedding_models(): @app.route('/api/models/image', methods=['GET']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def get_image_models(): diff --git a/application/single_app/route_backend_notifications.py b/application/single_app/route_backend_notifications.py new file mode 100644 index 00000000..8fe8dd58 --- /dev/null +++ b/application/single_app/route_backend_notifications.py @@ -0,0 +1,210 @@ +# route_backend_notifications.py + +from config import * +from functions_authentication import * +from functions_settings 
import * +from functions_notifications import * +from swagger_wrapper import swagger_route, get_auth_security +from functions_debug import debug_print + +def register_route_backend_notifications(app): + + @app.route("/api/notifications", methods=["GET"]) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + def api_get_notifications(): + """ + Get paginated notifications for the current user. + + Query Parameters: + page (int): Page number (default: 1) + per_page (int): Items per page (default: 20) + include_read (bool): Include read notifications (default: true) + include_dismissed (bool): Include dismissed notifications (default: false) + """ + try: + user_id = get_current_user_id() + user = session.get('user', {}) + user_roles = user.get('roles', []) + + # Get query parameters + page = int(request.args.get('page', 1)) + per_page = int(request.args.get('per_page', 20)) + include_read = request.args.get('include_read', 'true').lower() == 'true' + include_dismissed = request.args.get('include_dismissed', 'false').lower() == 'true' + + # Validate per_page + if per_page not in [10, 20, 50]: + per_page = 20 + + result = get_user_notifications( + user_id=user_id, + page=page, + per_page=per_page, + include_read=include_read, + include_dismissed=include_dismissed, + user_roles=user_roles + ) + + return jsonify({ + 'success': True, + **result + }) + + except Exception as e: + debug_print(f"Error fetching notifications: {e}") + return jsonify({ + 'success': False, + 'error': 'Failed to fetch notifications' + }), 500 + + @app.route("/api/notifications/count", methods=["GET"]) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + def api_get_notification_count(): + """ + Get count of unread notifications for the current user. + """ + try: + user_id = get_current_user_id() + count = get_unread_notification_count(user_id) + + return jsonify({ + 'success': True, + 'count': count + }) + + except Exception as e: + debug_print(f"Error fetching notification count: {e}") + return jsonify({ + 'success': False, + 'count': 0 + }), 500 + + @app.route("/api/notifications//read", methods=["POST"]) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + def api_mark_notification_read(notification_id): + """ + Mark a notification as read. + """ + try: + user_id = get_current_user_id() + success = mark_notification_read(notification_id, user_id) + + if success: + return jsonify({ + 'success': True, + 'message': 'Notification marked as read' + }) + else: + return jsonify({ + 'success': False, + 'error': 'Failed to mark notification as read' + }), 400 + + except Exception as e: + debug_print(f"Error marking notification as read: {e}") + return jsonify({ + 'success': False, + 'error': 'Internal server error' + }), 500 + + @app.route("/api/notifications//dismiss", methods=["DELETE"]) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + def api_dismiss_notification(notification_id): + """ + Dismiss a notification. 
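
The notification endpoints registered above can be smoke-tested with a short client script. The sketch below assumes the routes and query parameters exactly as defined in this diff; the base URL, authentication, and notification ID are placeholders to replace with your deployment's values.

```python
# Hypothetical smoke test for the new notification endpoints.
# BASE_URL, auth, and the notification ID are placeholders.
import requests

BASE_URL = "http://localhost:5000"  # placeholder
session = requests.Session()
# session.headers["Authorization"] = "Bearer <token>"  # or reuse a signed-in browser cookie

# Page 1 of notifications, 20 per page (valid per_page values: 10, 20, 50)
resp = session.get(f"{BASE_URL}/api/notifications",
                   params={"page": 1, "per_page": 20, "include_read": "true"})
print(resp.json())

# Unread badge count
print(session.get(f"{BASE_URL}/api/notifications/count").json())

# Mark one notification as read, then dismiss it
notification_id = "<notification-id>"  # placeholder
session.post(f"{BASE_URL}/api/notifications/{notification_id}/read")
session.delete(f"{BASE_URL}/api/notifications/{notification_id}/dismiss")

# Mark everything read
print(session.post(f"{BASE_URL}/api/notifications/mark-all-read").json())
```
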
+ """ + try: + user_id = get_current_user_id() + success = dismiss_notification(notification_id, user_id) + + if success: + return jsonify({ + 'success': True, + 'message': 'Notification dismissed' + }) + else: + return jsonify({ + 'success': False, + 'error': 'Failed to dismiss notification' + }), 400 + + except Exception as e: + debug_print(f"Error dismissing notification: {e}") + return jsonify({ + 'success': False, + 'error': 'Internal server error' + }), 500 + + @app.route("/api/notifications/mark-all-read", methods=["POST"]) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + def api_mark_all_read(): + """ + Mark all notifications as read for the current user. + """ + try: + user_id = get_current_user_id() + count = mark_all_read(user_id) + + return jsonify({ + 'success': True, + 'message': f'{count} notifications marked as read', + 'count': count + }) + + except Exception as e: + debug_print(f"Error marking all notifications as read: {e}") + return jsonify({ + 'success': False, + 'error': 'Internal server error' + }), 500 + + @app.route("/api/notifications/settings", methods=["POST"]) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + def api_update_notification_settings(): + """ + Update notification settings for the current user. + + Body: + notifications_per_page (int): Number of notifications per page (10, 20, or 50) + """ + try: + user_id = get_current_user_id() + data = request.get_json() + + per_page = data.get('notifications_per_page', 20) + + # Validate per_page + if per_page not in [10, 20, 50]: + return jsonify({ + 'success': False, + 'error': 'Invalid per_page value. Must be 10, 20, or 50.' + }), 400 + + # Update user settings + update_user_settings(user_id, { + 'notifications_per_page': per_page + }) + + return jsonify({ + 'success': True, + 'message': 'Settings updated' + }) + + except Exception as e: + debug_print(f"Error updating notification settings: {e}") + return jsonify({ + 'success': False, + 'error': 'Internal server error' + }), 500 diff --git a/application/single_app/route_backend_plugins.py b/application/single_app/route_backend_plugins.py index 51f0c6a0..edd53dbd 100644 --- a/application/single_app/route_backend_plugins.py +++ b/application/single_app/route_backend_plugins.py @@ -11,7 +11,7 @@ from swagger_wrapper import swagger_route, get_auth_security import logging import os - +from functions_debug import debug_print import importlib.util from functions_plugins import get_merged_plugin_settings from semantic_kernel_plugins.base_plugin import BasePlugin @@ -342,7 +342,7 @@ def set_user_plugins(): delete_personal_action(user_id, plugin_name) except Exception as e: - current_app.logger.error(f"Error saving personal actions for user {user_id}: {e}") + debug_print(f"Error saving personal actions for user {user_id}: {e}") return jsonify({'error': 'Failed to save plugins'}), 500 log_event("User plugins updated", extra={"user_id": user_id, "plugins_count": len(filtered_plugins)}) return jsonify({'success': True}) @@ -460,7 +460,7 @@ def create_group_action_route(): try: saved = save_group_action(active_group, payload) except Exception as exc: - current_app.logger.error('Failed to save group action: %s', exc) + debug_print('Failed to save group action: %s', exc) return jsonify({'error': 'Unable to save action'}), 500 return jsonify(saved), 201 @@ -513,7 +513,7 @@ def update_group_action_route(action_id): try: saved = save_group_action(active_group, merged) except Exception as exc: - 
current_app.logger.error('Failed to update group action %s: %s', action_id, exc) + debug_print('Failed to update group action %s: %s', action_id, exc) return jsonify({'error': 'Unable to update action'}), 500 return jsonify(saved), 200 @@ -539,7 +539,7 @@ def delete_group_action_route(action_id): try: removed = delete_group_action(active_group, action_id) except Exception as exc: - current_app.logger.error('Failed to delete group action %s: %s', action_id, exc) + debug_print('Failed to delete group action %s: %s', action_id, exc) return jsonify({'error': 'Unable to delete action'}), 500 if not removed: diff --git a/application/single_app/route_backend_public_documents.py b/application/single_app/route_backend_public_documents.py index 319a1e1b..9e228acd 100644 --- a/application/single_app/route_backend_public_documents.py +++ b/application/single_app/route_backend_public_documents.py @@ -35,8 +35,10 @@ def api_upload_public_document(): if not ws_doc: return jsonify({'error': 'Active public workspace not found'}), 404 - # check role - from functions_public_workspaces import get_user_role_in_public_workspace + allowed, reason = check_public_workspace_status_allows_operation(ws_doc, 'upload') + if not allowed: + return jsonify({'error': reason}), 403 + role = get_user_role_in_public_workspace(ws_doc, user_id) if role not in ['Owner', 'Admin', 'DocumentManager']: return jsonify({'error': 'Insufficient permissions'}), 403 @@ -271,21 +273,48 @@ def api_patch_public_document(doc_id): if role not in ['Owner','Admin','DocumentManager']: return jsonify({'error':'Access denied'}), 403 data = request.get_json() or {} + + # Track which fields were updated + updated_fields = {} + try: if 'title' in data: update_document(document_id=doc_id, public_workspace_id=active_ws, user_id=user_id, title=data['title']) + updated_fields['title'] = data['title'] if 'abstract' in data: update_document(document_id=doc_id, public_workspace_id=active_ws, user_id=user_id, abstract=data['abstract']) + updated_fields['abstract'] = data['abstract'] if 'keywords' in data: kws = data['keywords'] if isinstance(data['keywords'],list) else [k.strip() for k in data['keywords'].split(',')] update_document(document_id=doc_id, public_workspace_id=active_ws, user_id=user_id, keywords=kws) + updated_fields['keywords'] = kws if 'authors' in data: auths = data['authors'] if isinstance(data['authors'],list) else [data['authors']] update_document(document_id=doc_id, public_workspace_id=active_ws, user_id=user_id, authors=auths) + updated_fields['authors'] = auths if 'publication_date' in data: update_document(document_id=doc_id, public_workspace_id=active_ws, user_id=user_id, publication_date=data['publication_date']) + updated_fields['publication_date'] = data['publication_date'] if 'document_classification' in data: update_document(document_id=doc_id, public_workspace_id=active_ws, user_id=user_id, document_classification=data['document_classification']) + updated_fields['document_classification'] = data['document_classification'] + + # Log the metadata update transaction if any fields were updated + if updated_fields: + from functions_documents import get_document + from functions_activity_logging import log_document_metadata_update_transaction + doc = get_document(user_id, doc_id, public_workspace_id=active_ws) + if doc: + log_document_metadata_update_transaction( + user_id=user_id, + document_id=doc_id, + workspace_type='public', + file_name=doc.get('file_name', 'Unknown'), + updated_fields=updated_fields, + 
file_type=doc.get('file_type'), + public_workspace_id=active_ws + ) + return jsonify({'message':'Metadata updated'}), 200 except Exception as e: return jsonify({'error':str(e)}), 500 @@ -300,6 +329,14 @@ def api_delete_public_document(doc_id): settings = get_user_settings(user_id) active_ws = settings['settings'].get('activePublicWorkspaceOid') ws_doc = find_public_workspace_by_id(active_ws) if active_ws else None + + # Check if workspace status allows deletions + if ws_doc: + from functions_public_workspaces import check_public_workspace_status_allows_operation + allowed, reason = check_public_workspace_status_allows_operation(ws_doc, 'delete') + if not allowed: + return jsonify({'error': reason}), 403 + from functions_public_workspaces import get_user_role_in_public_workspace role = get_user_role_in_public_workspace(ws_doc, user_id) if ws_doc else None if role not in ['Owner','Admin','DocumentManager']: diff --git a/application/single_app/route_backend_public_workspaces.py b/application/single_app/route_backend_public_workspaces.py index d9383bfd..bce82787 100644 --- a/application/single_app/route_backend_public_workspaces.py +++ b/application/single_app/route_backend_public_workspaces.py @@ -3,9 +3,37 @@ from config import * from functions_authentication import * from functions_public_workspaces import * +from functions_notifications import create_notification from swagger_wrapper import swagger_route, get_auth_security +from functions_debug import debug_print +def is_user_in_admins(user_id, admins_list): + """ + Check if user is in admins list (supports both old format ["id1", "id2"] and new format [{userId, email, displayName}]) + """ + if not admins_list: + return False + for admin in admins_list: + if isinstance(admin, str): + if admin == user_id: + return True + elif isinstance(admin, dict): + if admin.get("userId") == user_id: + return True + return False + +def remove_user_from_admins(user_id, admins_list): + """ + Remove user from admins list (supports both old and new format) + Returns updated admins list + """ + if not admins_list: + return [] + return [admin for admin in admins_list if + (isinstance(admin, str) and admin != user_id) or + (isinstance(admin, dict) and admin.get("userId") != user_id)] + def get_user_details_from_graph(user_id): """ Get user details (displayName, email) from Microsoft Graph API by user ID. 
@@ -147,6 +175,7 @@ def api_list_public_workspaces(): "name": ws.get("name", ""), "description": ws.get("description", ""), "userRole": role, + "status": ws.get("status", "active"), "isActive": (ws["id"] == active_id) }) @@ -201,7 +230,7 @@ def api_get_public_workspace(ws_id): def api_update_public_workspace(ws_id): """ PATCH /api/public_workspaces/ - Body JSON: { "name": "", "description": "" } + Body JSON: { "name": "", "description": "", "heroColor": "" } """ info = get_current_user_info() user_id = info["userId"] @@ -215,6 +244,7 @@ def api_update_public_workspace(ws_id): data = request.get_json() or {} ws["name"] = data.get("name", ws.get("name")) ws["description"] = data.get("description", ws.get("description")) + ws["heroColor"] = data.get("heroColor", ws.get("heroColor", "#0078d4")) ws["modifiedDate"] = datetime.utcnow().isoformat() try: @@ -407,7 +437,7 @@ def api_list_public_members(ws_id): # must be member is_member = ( ws["owner"]["userId"] == user_id or - user_id in ws.get("admins", []) or + is_user_in_admins(user_id, ws.get("admins", [])) or any(dm["userId"] == user_id for dm in ws.get("documentManagers", [])) ) if not is_member: @@ -424,15 +454,25 @@ def api_list_public_members(ws_id): "email": ws["owner"].get("email", ""), "role": "Owner" }) - # admins - for aid in ws.get("admins", []): - admin_details = get_user_details_from_graph(aid) - results.append({ - "userId": aid, - "displayName": admin_details["displayName"], - "email": admin_details["email"], - "role": "Admin" - }) + # admins (support both old format ["id"] and new format [{userId, email, displayName}]) + for admin in ws.get("admins", []): + if isinstance(admin, str): + # Old format - fetch from Graph + admin_details = get_user_details_from_graph(admin) + results.append({ + "userId": admin, + "displayName": admin_details["displayName"], + "email": admin_details["email"], + "role": "Admin" + }) + elif isinstance(admin, dict): + # New format - use stored data + results.append({ + "userId": admin.get("userId", ""), + "displayName": admin.get("displayName", ""), + "email": admin.get("email", ""), + "role": "Admin" + }) # doc managers for dm in ws.get("documentManagers", []): results.append({ @@ -496,6 +536,25 @@ def api_add_public_member(ws_id): }) ws["modifiedDate"] = datetime.utcnow().isoformat() cosmos_public_workspaces_container.upsert_item(ws) + + # Send notification to the added member + try: + create_notification( + user_id=new_id, + notification_type='public_workspace_membership_change', + title='Added to Public Workspace', + message=f"You have been added to the public workspace '{ws.get('name', 'Unknown')}' as Document Manager.", + link_url=f"/manage_public_workspace?workspace_id={ws_id}", + metadata={ + 'workspace_id': ws_id, + 'workspace_name': ws.get('name', 'Unknown'), + 'role': 'DocumentManager', + 'added_by': info.get('email', 'Unknown') + } + ) + except Exception as notif_error: + debug_print(f"Failed to create notification for new member: {notif_error}") + return jsonify({"message": "Member added"}), 200 @app.route("/api/public_workspaces//members/", methods=["DELETE"]) @@ -523,15 +582,14 @@ def api_remove_public_member(ws_id, member_id): # only Owner/Admin can remove others role = ( "Owner" if ws["owner"]["userId"] == user_id else - "Admin" if user_id in ws.get("admins", []) else + "Admin" if is_user_in_admins(user_id, ws.get("admins", [])) else None ) if role not in ["Owner", "Admin"]: return jsonify({"error": "Forbidden"}), 403 # remove from admins if present - if member_id in ws.get("admins", []): 
- ws["admins"].remove(member_id) + ws["admins"] = remove_user_from_admins(member_id, ws.get("admins", [])) # remove from doc managers ws["documentManagers"] = [ dm for dm in ws.get("documentManagers", []) @@ -539,7 +597,7 @@ def api_remove_public_member(ws_id, member_id): ] ws["modifiedDate"] = datetime.utcnow().isoformat() cosmos_public_workspaces_container.upsert_item(ws) - return jsonify({"message": "Removed"}), 200 + return jsonify({"success": True, "message": "Removed"}), 200 @app.route("/api/public_workspaces//members/", methods=["PATCH"]) @swagger_route(security=get_auth_security()) @@ -563,22 +621,50 @@ def api_update_public_member_role(ws_id, member_id): role = ( "Owner" if ws["owner"]["userId"] == user_id else - "Admin" if user_id in ws.get("admins", []) else + "Admin" if is_user_in_admins(user_id, ws.get("admins", [])) else None ) if role not in ["Owner", "Admin"]: return jsonify({"error": "Forbidden"}), 403 + # Get member details (from documentManagers or Graph API) + member_name = "" + member_email = "" + for dm in ws.get("documentManagers", []): + if dm.get("userId") == member_id: + member_name = dm.get("displayName", "") + member_email = dm.get("email", "") + break + + # If not found in documentManagers, try to get from existing admins or Graph + if not member_name: + for admin in ws.get("admins", []): + if isinstance(admin, dict) and admin.get("userId") == member_id: + member_name = admin.get("displayName", "") + member_email = admin.get("email", "") + break + if not member_name: + # Fetch from Graph API + try: + details = get_user_details_from_graph(member_id) + member_name = details.get("displayName", "") + member_email = details.get("email", "") + except: + pass + # clear any existing - if member_id in ws.get("admins", []): - ws["admins"].remove(member_id) + ws["admins"] = remove_user_from_admins(member_id, ws.get("admins", [])) ws["documentManagers"] = [ dm for dm in ws.get("documentManagers", []) if dm["userId"] != member_id ] if new_role == "Admin": - ws.setdefault("admins", []).append(member_id) + ws.setdefault("admins", []).append({ + "userId": member_id, + "displayName": member_name, + "email": member_email + }) elif new_role == "DocumentManager": # need displayName/email from pending or empty ws.setdefault("documentManagers", []).append({ @@ -591,7 +677,37 @@ def api_update_public_member_role(ws_id, member_id): ws["modifiedDate"] = datetime.utcnow().isoformat() cosmos_public_workspaces_container.upsert_item(ws) - return jsonify({"message": "Role updated"}), 200 + + # Send notification to the member whose role changed + try: + # Determine old role for notification + old_role = "DocumentManager" # Default, will be corrected if needed + for admin in ws.get("admins", []): + if isinstance(admin, dict) and admin.get("userId") == member_id: + old_role = "Admin" + break + elif isinstance(admin, str) and admin == member_id: + old_role = "Admin" + break + + create_notification( + user_id=member_id, + notification_type='public_workspace_membership_change', + title='Workspace Role Changed', + message=f"Your role in the public workspace '{ws.get('name', 'Unknown')}' has been changed to {new_role}.", + link_url=f"/manage_public_workspace?workspace_id={ws_id}", + metadata={ + 'workspace_id': ws_id, + 'workspace_name': ws.get('name', 'Unknown'), + 'old_role': old_role, + 'new_role': new_role, + 'changed_by': info.get('email', 'Unknown') + } + ) + except Exception as notif_error: + debug_print(f"Failed to create notification for role change: {notif_error}") + + return 
jsonify({"success": True, "message": "Role updated"}), 200 @app.route("/api/public_workspaces//transferOwnership", methods=["PATCH"]) @swagger_route(security=get_auth_security()) @@ -710,3 +826,292 @@ def api_public_prompt_count(ws_id): ) prompt_count = next(count_iter, 0) return jsonify({"promptCount": prompt_count}), 200 + + @app.route("/api/public_workspaces//stats", methods=["GET"]) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + @enabled_required("enable_public_workspaces") + def api_public_workspace_stats(ws_id): + """ + GET /api/public_workspaces//stats + Returns statistics for the workspace including documents, storage, tokens, and members. + """ + info = get_current_user_info() + user_id = info["userId"] + + ws = find_public_workspace_by_id(ws_id) + if not ws: + return jsonify({"error": "Not found"}), 404 + + # Check user has access - must be member + is_member = ( + ws["owner"]["userId"] == user_id or + is_user_in_admins(user_id, ws.get("admins", [])) or + any(dm["userId"] == user_id for dm in ws.get("documentManagers", [])) + ) + if not is_member: + return jsonify({"error": "Forbidden"}), 403 + + # Get metrics from workspace record (pre-calculated) + metrics = ws.get("metrics", {}) + document_metrics = metrics.get("document_metrics", {}) + + total_documents = document_metrics.get("total_documents", 0) + storage_used = document_metrics.get("storage_account_size", 0) + + # Get member count + owner = ws.get("owner", {}) + admins = ws.get("admins", []) + doc_managers = ws.get("documentManagers", []) + total_members = 1 + len(admins) + len(doc_managers) + + # Get token usage from activity logs (last 30 days) + from datetime import datetime, timedelta + thirty_days_ago = (datetime.utcnow() - timedelta(days=30)).isoformat() + + debug_print(f"[PUBLIC_WORKSPACE_STATS] Workspace ID: {ws_id}") + debug_print(f"[PUBLIC_WORKSPACE_STATS] Start date: {thirty_days_ago}") + + token_query = """ + SELECT a.usage + FROM a + WHERE a.workspace_context.public_workspace_id = @wsId + AND a.timestamp >= @startDate + AND a.activity_type = 'token_usage' + """ + token_params = [ + {"name": "@wsId", "value": ws_id}, + {"name": "@startDate", "value": thirty_days_ago} + ] + + total_tokens = 0 + try: + token_iter = cosmos_activity_logs_container.query_items( + query=token_query, + parameters=token_params, + enable_cross_partition_query=True + ) + for item in token_iter: + usage = item.get("usage", {}) + total_tokens += usage.get("total_tokens", 0) + debug_print(f"[PUBLIC_WORKSPACE_STATS] Total tokens accumulated: {total_tokens}") + except Exception as e: + debug_print(f"[PUBLIC_WORKSPACE_STATS] Error querying total tokens: {e}") + import traceback + traceback.print_exc() + + # Get activity data for charts (last 30 days) + doc_activity_labels = [] + doc_upload_data = [] + doc_delete_data = [] + token_usage_labels = [] + token_usage_data = [] + + # Generate labels for last 30 days + for i in range(29, -1, -1): + date = datetime.utcnow() - timedelta(days=i) + doc_activity_labels.append(date.strftime("%m/%d")) + token_usage_labels.append(date.strftime("%m/%d")) + doc_upload_data.append(0) + doc_delete_data.append(0) + token_usage_data.append(0) + + # Get document upload activity by day + doc_upload_query = """ + SELECT a.timestamp, a.created_at + FROM a + WHERE a.workspace_context.public_workspace_id = @wsId + AND a.timestamp >= @startDate + AND a.activity_type = 'document_creation' + """ + debug_print(f"[PUBLIC_WORKSPACE_STATS] Document upload query: {doc_upload_query}") + 
debug_print(f"[PUBLIC_WORKSPACE_STATS] Query params: {token_params}") + try: + activity_iter = cosmos_activity_logs_container.query_items( + query=doc_upload_query, + parameters=token_params, + enable_cross_partition_query=True + ) + upload_results = list(activity_iter) + debug_print(f"[PUBLIC_WORKSPACE_STATS] Document upload results count: {len(upload_results)}") + + for item in upload_results: + timestamp = item.get("timestamp") or item.get("created_at") + if timestamp: + try: + dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00')) + day_date = dt.strftime("%m/%d") + if day_date in doc_activity_labels: + idx = doc_activity_labels.index(day_date) + doc_upload_data[idx] += 1 + debug_print(f"[PUBLIC_WORKSPACE_STATS] Added upload for {day_date}") + except Exception as e: + debug_print(f"[PUBLIC_WORKSPACE_STATS] Error parsing timestamp {timestamp}: {e}") + except Exception as e: + debug_print(f"[PUBLIC_WORKSPACE_STATS] Error querying document uploads: {e}") + import traceback + traceback.print_exc() + + # Get document delete activity by day + doc_delete_query = """ + SELECT a.timestamp, a.created_at + FROM a + WHERE a.workspace_context.public_workspace_id = @wsId + AND a.timestamp >= @startDate + AND a.activity_type = 'document_deletion' + """ + debug_print(f"[PUBLIC_WORKSPACE_STATS] Document delete query: {doc_delete_query}") + try: + delete_iter = cosmos_activity_logs_container.query_items( + query=doc_delete_query, + parameters=token_params, + enable_cross_partition_query=True + ) + delete_results = list(delete_iter) + debug_print(f"[PUBLIC_WORKSPACE_STATS] Document delete results count: {len(delete_results)}") + + for item in delete_results: + timestamp = item.get("timestamp") or item.get("created_at") + if timestamp: + try: + dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00')) + day_date = dt.strftime("%m/%d") + if day_date in doc_activity_labels: + idx = doc_activity_labels.index(day_date) + doc_delete_data[idx] += 1 + debug_print(f"[PUBLIC_WORKSPACE_STATS] Added delete for {day_date}") + except Exception as e: + debug_print(f"[PUBLIC_WORKSPACE_STATS] Error parsing timestamp {timestamp}: {e}") + except Exception as e: + debug_print(f"[PUBLIC_WORKSPACE_STATS] Error querying document deletes: {e}") + import traceback + traceback.print_exc() + + # Get token usage by day + token_activity_query = """ + SELECT a.timestamp, a.created_at, a.usage + FROM a + WHERE a.workspace_context.public_workspace_id = @wsId + AND a.timestamp >= @startDate + AND a.activity_type = 'token_usage' + """ + debug_print(f"[PUBLIC_WORKSPACE_STATS] Token usage query: {token_activity_query}") + try: + token_activity_iter = cosmos_activity_logs_container.query_items( + query=token_activity_query, + parameters=token_params, + enable_cross_partition_query=True + ) + token_results = list(token_activity_iter) + debug_print(f"[PUBLIC_WORKSPACE_STATS] Token usage results count: {len(token_results)}") + + for item in token_results: + timestamp = item.get("timestamp") or item.get("created_at") + if timestamp: + try: + dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00')) + day_date = dt.strftime("%m/%d") + if day_date in token_usage_labels: + idx = token_usage_labels.index(day_date) + usage = item.get("usage", {}) + tokens = usage.get("total_tokens", 0) + token_usage_data[idx] += tokens + debug_print(f"[PUBLIC_WORKSPACE_STATS] Added {tokens} tokens for {day_date}") + except Exception as e: + debug_print(f"[PUBLIC_WORKSPACE_STATS] Error parsing timestamp {timestamp}: {e}") + except Exception as e: + 
debug_print(f"[PUBLIC_WORKSPACE_STATS] Error querying token usage: {e}") + import traceback + traceback.print_exc() + + # Get separate storage metrics + ai_search_size = document_metrics.get("ai_search_size", 0) + storage_account_size = document_metrics.get("storage_account_size", 0) + + stats = { + "totalDocuments": total_documents, + "storageUsed": storage_used, + "storageLimit": 10737418240, # 10GB default + "totalTokens": total_tokens, + "totalMembers": total_members, + "storage": { + "ai_search_size": ai_search_size, + "storage_account_size": storage_account_size + }, + "documentActivity": { + "labels": doc_activity_labels, + "uploads": doc_upload_data, + "deletes": doc_delete_data + }, + "tokenUsage": { + "labels": token_usage_labels, + "data": token_usage_data + } + } + + debug_print(f"[PUBLIC_WORKSPACE_STATS] Final stats: {stats}") + + return jsonify(stats), 200 + + @app.route("/api/public_workspaces//activity", methods=["GET"]) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + @enabled_required("enable_public_workspaces") + def api_public_workspace_activity(ws_id): + """ + GET /api/public_workspaces//activity + Returns recent activity timeline for the workspace. + Only accessible by owner and admins. + """ + info = get_current_user_info() + user_id = info["userId"] + + ws = find_public_workspace_by_id(ws_id) + if not ws: + return jsonify({"error": "Not found"}), 404 + + # Check user is owner or admin (NOT document managers or regular members) + is_owner = ws["owner"]["userId"] == user_id + is_admin = is_user_in_admins(user_id, ws.get("admins", [])) + + if not (is_owner or is_admin): + return jsonify({"error": "Forbidden - Only workspace owners and admins can view activity timeline"}), 403 + + # Get pagination parameters + limit = request.args.get('limit', 50, type=int) + if limit not in [10, 20, 50]: + limit = 50 + + # Get recent activity + query = f""" + SELECT TOP {limit} * + FROM a + WHERE a.workspace_context.public_workspace_id = @wsId + ORDER BY a.timestamp DESC + """ + params = [{"name": "@wsId", "value": ws_id}] + + debug_print(f"[PUBLIC_WORKSPACE_ACTIVITY] Workspace ID: {ws_id}") + debug_print(f"[PUBLIC_WORKSPACE_ACTIVITY] Query: {query}") + debug_print(f"[PUBLIC_WORKSPACE_ACTIVITY] Params: {params}") + + activities = [] + try: + activity_iter = cosmos_activity_logs_container.query_items( + query=query, + parameters=params, + enable_cross_partition_query=True + ) + activities = list(activity_iter) + debug_print(f"[PUBLIC_WORKSPACE_ACTIVITY] Found {len(activities)} activity records") + if activities: + debug_print(f"[PUBLIC_WORKSPACE_ACTIVITY] Sample activity: {activities[0] if activities else 'None'}") + except Exception as e: + debug_print(f"[PUBLIC_WORKSPACE_ACTIVITY] Error querying activities: {e}") + import traceback + traceback.print_exc() + + return jsonify(activities), 200 + diff --git a/application/single_app/route_backend_retention_policy.py b/application/single_app/route_backend_retention_policy.py new file mode 100644 index 00000000..70d5cc76 --- /dev/null +++ b/application/single_app/route_backend_retention_policy.py @@ -0,0 +1,466 @@ +# route_backend_retention_policy.py + +from config import * +from functions_authentication import * +from functions_settings import * +from functions_retention_policy import execute_retention_policy +from swagger_wrapper import swagger_route, get_auth_security +from functions_debug import debug_print + + +def register_route_backend_retention_policy(app): + + 
@app.route('/api/admin/retention-policy/settings', methods=['GET']) + @swagger_route(security=get_auth_security()) + @login_required + @admin_required + def get_retention_policy_settings(): + """ + Get current retention policy settings and status. + """ + try: + settings = get_settings() + + return jsonify({ + 'success': True, + 'settings': { + 'enable_retention_policy_personal': settings.get('enable_retention_policy_personal', False), + 'enable_retention_policy_group': settings.get('enable_retention_policy_group', False), + 'enable_retention_policy_public': settings.get('enable_retention_policy_public', False), + 'retention_policy_execution_hour': settings.get('retention_policy_execution_hour', 2), + 'retention_policy_last_run': settings.get('retention_policy_last_run'), + 'retention_policy_next_run': settings.get('retention_policy_next_run'), + 'retention_conversation_min_days': settings.get('retention_conversation_min_days', 1), + 'retention_conversation_max_days': settings.get('retention_conversation_max_days', 3650), + 'retention_document_min_days': settings.get('retention_document_min_days', 1), + 'retention_document_max_days': settings.get('retention_document_max_days', 3650) + } + }) + + except Exception as e: + debug_print(f"Error fetching retention policy settings: {e}") + log_event(f"Fetching retention policy settings failed: {e}", level=logging.ERROR) + return jsonify({ + 'success': False, + 'error': 'Failed to fetch retention policy settings' + }), 500 + + + @app.route('/api/admin/retention-policy/settings', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @admin_required + def update_retention_policy_settings(): + """ + Update retention policy admin settings. + + Body: + enable_retention_policy_personal (bool): Enable for personal workspaces + enable_retention_policy_group (bool): Enable for group workspaces + enable_retention_policy_public (bool): Enable for public workspaces + retention_policy_execution_hour (int): Hour of day to execute (0-23) + """ + try: + data = request.get_json() + settings = get_settings() + + # Update settings if provided + if 'enable_retention_policy_personal' in data: + settings['enable_retention_policy_personal'] = bool(data['enable_retention_policy_personal']) + + if 'enable_retention_policy_group' in data: + settings['enable_retention_policy_group'] = bool(data['enable_retention_policy_group']) + + if 'enable_retention_policy_public' in data: + settings['enable_retention_policy_public'] = bool(data['enable_retention_policy_public']) + + if 'retention_policy_execution_hour' in data: + hour = int(data['retention_policy_execution_hour']) + if 0 <= hour <= 23: + settings['retention_policy_execution_hour'] = hour + + # Recalculate next run time + next_run = datetime.now(timezone.utc).replace(hour=hour, minute=0, second=0, microsecond=0) + if next_run <= datetime.now(timezone.utc): + next_run += timedelta(days=1) + settings['retention_policy_next_run'] = next_run.isoformat() + else: + return jsonify({ + 'success': False, + 'error': 'Execution hour must be between 0 and 23' + }), 400 + + update_settings(settings) + + return jsonify({ + 'success': True, + 'message': 'Retention policy settings updated successfully' + }) + + except Exception as e: + debug_print(f"Error updating retention policy settings: {e}") + log_event(f"Retention policy settings update failed: {e}", level=logging.ERROR) + return jsonify({ + 'success': False, + 'error': 'Failed to update retention policy settings' + }), 500 + + + 
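The two admin endpoints above expose the retention-policy configuration as plain JSON. As an illustration only (not part of this patch), a minimal client sketch follows; the base URL and the bearer token are assumptions, since a real deployment authenticates through the app's Entra ID sign-in.

```python
# Illustrative sketch, not part of this patch. BASE_URL and ACCESS_TOKEN are
# placeholders: obtain a valid credential however your deployment requires.
import requests

BASE_URL = "https://localhost:5000"          # assumed app host
ACCESS_TOKEN = "replace-with-a-valid-token"  # placeholder credential
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Read the current retention-policy configuration.
resp = requests.get(f"{BASE_URL}/api/admin/retention-policy/settings", headers=HEADERS)
resp.raise_for_status()
print(resp.json()["settings"])

# Enable the policy for personal workspaces and schedule the daily run for 02:00.
payload = {
    "enable_retention_policy_personal": True,
    "retention_policy_execution_hour": 2,  # must be 0-23 or the endpoint returns 400
}
resp = requests.post(
    f"{BASE_URL}/api/admin/retention-policy/settings",
    headers=HEADERS,
    json=payload,
)
print(resp.json())  # expected: {"success": true, "message": "Retention policy settings updated successfully"}
```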
@app.route('/api/admin/retention-policy/execute', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @admin_required + def manual_execute_retention_policy(): + """ + Manually execute retention policy for selected workspace scopes. + + Body: + scopes (list): List of workspace types to process: 'personal', 'group', 'public' + """ + try: + data = request.get_json() + scopes = data.get('scopes', []) + + if not scopes: + return jsonify({ + 'success': False, + 'error': 'No workspace scopes provided' + }), 400 + + # Validate scopes + valid_scopes = ['personal', 'group', 'public'] + invalid_scopes = [s for s in scopes if s not in valid_scopes] + if invalid_scopes: + return jsonify({ + 'success': False, + 'error': f'Invalid workspace scopes: {", ".join(invalid_scopes)}' + }), 400 + + # Execute retention policy for selected scopes + debug_print(f"Manual execution of retention policy for scopes: {scopes}") + results = execute_retention_policy(workspace_scopes=scopes, manual_execution=True) + + return jsonify({ + 'success': results.get('success', False), + 'message': 'Retention policy executed successfully' if results.get('success') else 'Retention policy execution failed', + 'results': results + }) + + except Exception as e: + debug_print(f"Error executing retention policy manually: {e}") + log_event(f"Manual retention policy execution failed: {e}", level=logging.ERROR) + return jsonify({ + 'success': False, + 'error': f'Failed to execute retention policy: {str(e)}' + }), 500 + + + @app.route('/api/retention-policy/user', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + def update_user_retention_settings(): + """ + Update retention policy settings for the current user's personal workspace. 
+ + Body: + conversation_retention_days (str|int): Number of days or 'none' + document_retention_days (str|int): Number of days or 'none' + """ + try: + user_id = get_current_user_id() + data = request.get_json() + + retention_settings = {} + + # Validate and parse conversation retention + if 'conversation_retention_days' in data: + conv_retention = data['conversation_retention_days'] + if conv_retention == 'none' or conv_retention is None: + retention_settings['conversation_retention_days'] = 'none' + else: + try: + days = int(conv_retention) + settings = get_settings() + min_days = settings.get('retention_conversation_min_days', 1) + max_days = settings.get('retention_conversation_max_days', 3650) + + if days < min_days or days > max_days: + return jsonify({ + 'success': False, + 'error': f'Conversation retention must be between {min_days} and {max_days} days' + }), 400 + + retention_settings['conversation_retention_days'] = days + except ValueError: + return jsonify({ + 'success': False, + 'error': 'Invalid conversation retention value' + }), 400 + + # Validate and parse document retention + if 'document_retention_days' in data: + doc_retention = data['document_retention_days'] + if doc_retention == 'none' or doc_retention is None: + retention_settings['document_retention_days'] = 'none' + else: + try: + days = int(doc_retention) + settings = get_settings() + min_days = settings.get('retention_document_min_days', 1) + max_days = settings.get('retention_document_max_days', 3650) + + if days < min_days or days > max_days: + return jsonify({ + 'success': False, + 'error': f'Document retention must be between {min_days} and {max_days} days' + }), 400 + + retention_settings['document_retention_days'] = days + except ValueError: + return jsonify({ + 'success': False, + 'error': 'Invalid document retention value' + }), 400 + + if not retention_settings: + return jsonify({ + 'success': False, + 'error': 'No retention settings provided' + }), 400 + + # Update user settings + update_user_settings(user_id, {'retention_policy': retention_settings}) + + return jsonify({ + 'success': True, + 'message': 'Retention settings updated successfully' + }) + + except Exception as e: + debug_print(f"Error updating user retention settings: {e}") + log_event(f"User retention settings update failed: {e}", level=logging.ERROR) + return jsonify({ + 'success': False, + 'error': 'Failed to update retention settings' + }), 500 + + + @app.route('/api/retention-policy/group/', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + def update_group_retention_settings(group_id): + """ + Update retention policy settings for a group workspace. + User must be owner or admin of the group. + + Body: + conversation_retention_days (str|int): Number of days or 'none' + document_retention_days (str|int): Number of days or 'none' + """ + try: + user_id = get_current_user_id() + data = request.get_json() + + # Get group and verify permissions + from functions_group import find_group_by_id, get_user_role_in_group + group = find_group_by_id(group_id) + + if not group: + return jsonify({ + 'success': False, + 'error': 'Group not found' + }), 404 + + user_role = get_user_role_in_group(group, user_id) + if user_role not in ['Owner', 'Admin']: + return jsonify({ + 'success': False, + 'error': 'Insufficient permissions. Must be group owner or admin.' 
+ }), 403 + + retention_settings = {} + + # Validate and parse conversation retention + if 'conversation_retention_days' in data: + conv_retention = data['conversation_retention_days'] + if conv_retention == 'none' or conv_retention is None: + retention_settings['conversation_retention_days'] = 'none' + else: + try: + days = int(conv_retention) + settings = get_settings() + min_days = settings.get('retention_conversation_min_days', 1) + max_days = settings.get('retention_conversation_max_days', 3650) + + if days < min_days or days > max_days: + return jsonify({ + 'success': False, + 'error': f'Conversation retention must be between {min_days} and {max_days} days' + }), 400 + + retention_settings['conversation_retention_days'] = days + except ValueError: + return jsonify({ + 'success': False, + 'error': 'Invalid conversation retention value' + }), 400 + + # Validate and parse document retention + if 'document_retention_days' in data: + doc_retention = data['document_retention_days'] + if doc_retention == 'none' or doc_retention is None: + retention_settings['document_retention_days'] = 'none' + else: + try: + days = int(doc_retention) + settings = get_settings() + min_days = settings.get('retention_document_min_days', 1) + max_days = settings.get('retention_document_max_days', 3650) + + if days < min_days or days > max_days: + return jsonify({ + 'success': False, + 'error': f'Document retention must be between {min_days} and {max_days} days' + }), 400 + + retention_settings['document_retention_days'] = days + except ValueError: + return jsonify({ + 'success': False, + 'error': 'Invalid document retention value' + }), 400 + + if not retention_settings: + return jsonify({ + 'success': False, + 'error': 'No retention settings provided' + }), 400 + + # Update group document + group['retention_policy'] = retention_settings + cosmos_groups_container.upsert_item(group) + + return jsonify({ + 'success': True, + 'message': 'Group retention settings updated successfully' + }) + + except Exception as e: + debug_print(f"Error updating group retention settings: {e}") + log_event(f"Group retention settings update failed: {e}", level=logging.ERROR) + return jsonify({ + 'success': False, + 'error': 'Failed to update retention settings' + }), 500 + + + @app.route('/api/retention-policy/public/', methods=['POST']) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + def update_public_workspace_retention_settings(public_workspace_id): + """ + Update retention policy settings for a public workspace. + User must be owner or admin of the workspace. + + Body: + conversation_retention_days (str|int): Number of days or 'none' + document_retention_days (str|int): Number of days or 'none' + """ + try: + user_id = get_current_user_id() + data = request.get_json() + + # Get workspace and verify permissions + from functions_public_workspaces import find_public_workspace_by_id, get_user_role_in_public_workspace + workspace = find_public_workspace_by_id(public_workspace_id) + + if not workspace: + return jsonify({ + 'success': False, + 'error': 'Public workspace not found' + }), 404 + + user_role = get_user_role_in_public_workspace(workspace, user_id) + if user_role not in ['Owner', 'Admin']: + return jsonify({ + 'success': False, + 'error': 'Insufficient permissions. Must be workspace owner or admin.' 
+ }), 403 + + retention_settings = {} + + # Validate and parse conversation retention + if 'conversation_retention_days' in data: + conv_retention = data['conversation_retention_days'] + if conv_retention == 'none' or conv_retention is None: + retention_settings['conversation_retention_days'] = 'none' + else: + try: + days = int(conv_retention) + settings = get_settings() + min_days = settings.get('retention_conversation_min_days', 1) + max_days = settings.get('retention_conversation_max_days', 3650) + + if days < min_days or days > max_days: + return jsonify({ + 'success': False, + 'error': f'Conversation retention must be between {min_days} and {max_days} days' + }), 400 + + retention_settings['conversation_retention_days'] = days + except ValueError: + return jsonify({ + 'success': False, + 'error': 'Invalid conversation retention value' + }), 400 + + # Validate and parse document retention + if 'document_retention_days' in data: + doc_retention = data['document_retention_days'] + if doc_retention == 'none' or doc_retention is None: + retention_settings['document_retention_days'] = 'none' + else: + try: + days = int(doc_retention) + settings = get_settings() + min_days = settings.get('retention_document_min_days', 1) + max_days = settings.get('retention_document_max_days', 3650) + + if days < min_days or days > max_days: + return jsonify({ + 'success': False, + 'error': f'Document retention must be between {min_days} and {max_days} days' + }), 400 + + retention_settings['document_retention_days'] = days + except ValueError: + return jsonify({ + 'success': False, + 'error': 'Invalid document retention value' + }), 400 + + if not retention_settings: + return jsonify({ + 'success': False, + 'error': 'No retention settings provided' + }), 400 + + # Update workspace document + workspace['retention_policy'] = retention_settings + cosmos_public_workspaces_container.upsert_item(workspace) + + return jsonify({ + 'success': True, + 'message': 'Public workspace retention settings updated successfully' + }) + + except Exception as e: + debug_print(f"Error updating public workspace retention settings: {e}") + log_event(f"Public workspace retention settings update failed: {e}", level=logging.ERROR) + return jsonify({ + 'success': False, + 'error': 'Failed to update retention settings' + }), 500 diff --git a/application/single_app/route_backend_safety.py b/application/single_app/route_backend_safety.py index 350f4a86..73eb6e56 100644 --- a/application/single_app/route_backend_safety.py +++ b/application/single_app/route_backend_safety.py @@ -9,7 +9,7 @@ def register_route_backend_safety(app): @app.route('/api/safety/logs', methods=['GET']) @swagger_route(security=get_auth_security()) @login_required - @admin_required + @safety_violation_admin_required @enabled_required("enable_content_safety") def get_safety_logs(): """ @@ -96,7 +96,7 @@ def get_safety_logs(): @app.route('/api/safety/logs/', methods=['PATCH']) @swagger_route(security=get_auth_security()) @login_required - @admin_required + @safety_violation_admin_required @enabled_required("enable_content_safety") def update_safety_log(log_id): """ diff --git a/application/single_app/route_backend_settings.py b/application/single_app/route_backend_settings.py index 6855ea65..be182e93 100644 --- a/application/single_app/route_backend_settings.py +++ b/application/single_app/route_backend_settings.py @@ -279,6 +279,9 @@ def test_connection(): elif test_type == 'azure_doc_intelligence': return _test_azure_doc_intelligence_connection(data) + elif 
test_type == 'multimodal_vision': + return _test_multimodal_vision_connection(data) + elif test_type == 'chunking_api': # If you have a chunking API test, implement it here. return jsonify({'message': 'Chunking API connection successful'}), 200 @@ -294,6 +297,86 @@ def test_connection(): except Exception as e: return jsonify({'error': str(e)}), 500 + +def _test_multimodal_vision_connection(payload): + """Test multi-modal vision analysis with a sample image.""" + enable_apim = payload.get('enable_apim', False) + vision_model = payload.get('vision_model') + + if not vision_model: + return jsonify({'error': 'No vision model specified'}), 400 + + # Create a simple test image (1x1 red pixel PNG) + test_image_base64 = "iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAADUlEQVR42mP8/5+hHgAHggJ/PchI7wAAAABJRU5ErkJggg==" + + try: + if enable_apim: + apim_data = payload.get('apim', {}) + endpoint = apim_data.get('endpoint') + api_version = apim_data.get('api_version') + subscription_key = apim_data.get('subscription_key') + + gpt_client = AzureOpenAI( + api_version=api_version, + azure_endpoint=endpoint, + api_key=subscription_key + ) + else: + direct_data = payload.get('direct', {}) + endpoint = direct_data.get('endpoint') + api_version = direct_data.get('api_version') + auth_type = direct_data.get('auth_type', 'key') + + if auth_type == 'managed_identity': + token_provider = get_bearer_token_provider( + DefaultAzureCredential(), + cognitive_services_scope + ) + gpt_client = AzureOpenAI( + api_version=api_version, + azure_endpoint=endpoint, + azure_ad_token_provider=token_provider + ) + else: + api_key = direct_data.get('key') + gpt_client = AzureOpenAI( + api_version=api_version, + azure_endpoint=endpoint, + api_key=api_key + ) + + # Test vision analysis with simple prompt + response = gpt_client.chat.completions.create( + model=vision_model, + messages=[ + { + "role": "user", + "content": [ + { + "type": "text", + "text": "What color is this image? Just say the color." + }, + { + "type": "image_url", + "image_url": { + "url": f"data:image/png;base64,{test_image_base64}" + } + } + ] + } + ], + max_tokens=50 + ) + + result = response.choices[0].message.content + + return jsonify({ + 'message': 'Multi-modal vision connection successful', + 'details': f'Model responded: {result}' + }), 200 + + except Exception as e: + return jsonify({'error': f'Vision test failed: {str(e)}'}), 500 def _test_multimodal_vision_connection(payload): """Test multi-modal vision analysis with a sample image.""" diff --git a/application/single_app/route_backend_speech.py b/application/single_app/route_backend_speech.py new file mode 100644 index 00000000..3c559ce2 --- /dev/null +++ b/application/single_app/route_backend_speech.py @@ -0,0 +1,204 @@ +# route_backend_speech.py +""" +Backend routes for speech-to-text functionality. +""" +from config import * +from functions_authentication import login_required, get_current_user_id +from functions_settings import get_settings +from functions_debug import debug_print +import azure.cognitiveservices.speech as speechsdk +import os +import tempfile + +try: + from pydub import AudioSegment + PYDUB_AVAILABLE = True +except ImportError: + PYDUB_AVAILABLE = False + print("Warning: pydub not available. Audio conversion may fail for non-WAV formats.") + +def register_route_backend_speech(app): + """Register speech-to-text routes""" + + @app.route('/api/speech/transcribe-chat', methods=['POST']) + @login_required + def transcribe_chat_audio(): + """ + Transcribe audio from chat speech input. 
+ Expects audio blob in 'audio' field of FormData. + Returns JSON with transcribed text or error. + """ + user_id = get_current_user_id() + + # Get settings + settings = get_settings() + + # Check if speech-to-text chat input is enabled + if not settings.get('enable_speech_to_text_input', False): + return jsonify({ + 'success': False, + 'error': 'Speech-to-text chat input is not enabled' + }), 403 + + # Check if audio file was provided + if 'audio' not in request.files: + return jsonify({ + 'success': False, + 'error': 'No audio file provided' + }), 400 + + audio_file = request.files['audio'] + + if audio_file.filename == '': + return jsonify({ + 'success': False, + 'error': 'Empty audio file' + }), 400 + + print(f"[Debug] Received audio file: {audio_file.filename}") + + # Save audio to temporary WAV file + temp_audio_path = None + + try: + # Create temporary file for uploaded audio (always WAV from frontend) + with tempfile.NamedTemporaryFile(delete=False, suffix='.wav') as temp_audio: + audio_file.save(temp_audio.name) + temp_audio_path = temp_audio.name + + print(f"[Debug] Audio saved to: {temp_audio_path}") + + # Get speech configuration using existing helper + from functions_documents import _get_speech_config + + speech_endpoint = settings.get('speech_service_endpoint', '') + speech_locale = settings.get('speech_service_locale', 'en-US') + + if not speech_endpoint: + return jsonify({ + 'success': False, + 'error': 'Speech service endpoint not configured' + }), 500 + + # Get speech config + speech_config = _get_speech_config(settings, speech_endpoint, speech_locale) + + print("[Debug] Speech config obtained successfully") + + # WAV files can use direct file input + print(f"[Debug] Using WAV file directly: {temp_audio_path}") + audio_config = speechsdk.AudioConfig(filename=temp_audio_path) + + # Create speech recognizer + speech_recognizer = speechsdk.SpeechRecognizer( + speech_config=speech_config, + audio_config=audio_config + ) + + # Get audio file size for debugging + audio_file_size = os.path.getsize(temp_audio_path) + debug_print(f"[Speech] Audio file size: {audio_file_size} bytes") + + try: + debug_print("[Speech] Starting continuous recognition for longer audio...") + + # Use continuous recognition for longer audio files + all_results = [] + done = False + + def handle_recognized(evt): + """Handle recognized speech events""" + if evt.result.reason == speechsdk.ResultReason.RecognizedSpeech: + debug_print(f"[Speech] Recognized: {evt.result.text}") + all_results.append(evt.result.text) + + def handle_canceled(evt): + """Handle cancellation events""" + nonlocal done + debug_print(f"[Speech] Canceled: {evt}") + if evt.reason == speechsdk.CancellationReason.Error: + debug_print(f"[Speech] Error details: {evt.error_details}") + done = True + + def handle_session_stopped(evt): + """Handle session stopped events""" + nonlocal done + debug_print("[Speech] Session stopped") + done = True + + # Connect callbacks + speech_recognizer.recognized.connect(handle_recognized) + speech_recognizer.canceled.connect(handle_canceled) + speech_recognizer.session_stopped.connect(handle_session_stopped) + + # Start continuous recognition + speech_recognizer.start_continuous_recognition() + + # Wait for completion (timeout after 120 seconds) + import time + timeout = 120 + elapsed = 0 + while not done and elapsed < timeout: + time.sleep(0.1) + elapsed += 0.1 + + # Stop recognition + speech_recognizer.stop_continuous_recognition() + + debug_print(f"[Speech] Recognition complete. 
Recognized {len(all_results)} segments") + + # Combine all recognized text + if all_results: + combined_text = ' '.join(all_results) + debug_print(f"[Speech] Combined text length: {len(combined_text)} characters") + return jsonify({ + 'success': True, + 'text': combined_text + }) + else: + debug_print("[Speech] No speech recognized") + return jsonify({ + 'success': False, + 'error': 'No speech could be recognized' + }) + finally: + # Properly close the recognizer to release file handles + try: + if speech_recognizer: + # Disconnect all callbacks + speech_recognizer.recognized.disconnect_all() + speech_recognizer.canceled.disconnect_all() + speech_recognizer.session_stopped.disconnect_all() + debug_print("[Speech] Disconnected recognizer callbacks") + + # Give the recognizer time to release resources + import time + time.sleep(0.2) + + debug_print("[Speech] Speech recognizer cleanup complete") + except Exception as recognizer_cleanup_error: + print(f"[Debug] Error during recognizer cleanup: {recognizer_cleanup_error}") + + except Exception as e: + print(f"Error transcribing audio: {e}") + import traceback + traceback.print_exc() + return jsonify({ + 'success': False, + 'error': str(e) + }), 500 + + finally: + # Clean up temporary files + if temp_audio_path and os.path.exists(temp_audio_path): + try: + # Longer delay to ensure file handle is fully released on Windows + import time + time.sleep(0.3) + os.remove(temp_audio_path) + print(f"[Debug] Cleaned up temp file: {temp_audio_path}") + except PermissionError as perm_error: + # If still locked, schedule for deletion on next boot or ignore + print(f"[Debug] Temp file still locked, will be cleaned by OS: {temp_audio_path}") + except Exception as cleanup_error: + print(f"[Debug] Error cleaning up temporary files: {cleanup_error}") diff --git a/application/single_app/route_backend_tts.py b/application/single_app/route_backend_tts.py new file mode 100644 index 00000000..11d14cc3 --- /dev/null +++ b/application/single_app/route_backend_tts.py @@ -0,0 +1,238 @@ +# route_backend_tts.py + +from config import * +from functions_authentication import * +from functions_settings import * +from functions_debug import debug_print +from swagger_wrapper import swagger_route, get_auth_security +import azure.cognitiveservices.speech as speechsdk +import io +import time +import random + +def register_route_backend_tts(app): + """ + Text-to-speech API routes using Azure Speech Services + """ + + @app.route("/api/chat/tts", methods=["POST"]) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + def synthesize_speech(): + """ + Synthesize text to speech using Azure Speech Service. 
+        Expects JSON: {
+            "text": "Text to synthesize",
+            "voice": "en-US-Andrew:DragonHDLatestNeural",  # optional, defaults to Andrew
+            "speed": 1.0  # optional, 0.5-2.0 range
+        }
+        Returns an audio/mpeg (MP3) stream
+        """
+        try:
+            debug_print("[TTS] Synthesize speech request received")
+
+            # Get settings
+            settings = get_settings()
+
+            # Check if TTS is enabled
+            if not settings.get('enable_text_to_speech', False):
+                debug_print("[TTS] Text-to-speech is not enabled in settings")
+                return jsonify({"error": "Text-to-speech is not enabled"}), 403
+
+            # Validate speech service configuration
+            speech_key = settings.get('speech_service_key', '')
+            speech_region = settings.get('speech_service_location', '')
+
+            if not speech_key or not speech_region:
+                debug_print("[TTS] Speech service not configured - missing key or region")
+                return jsonify({"error": "Speech service not configured"}), 500
+
+            debug_print(f"[TTS] Speech service configured - region: {speech_region}")
+
+            # Parse request data
+            data = request.get_json()
+            if not data or 'text' not in data:
+                debug_print("[TTS] Invalid request - missing 'text' field")
+                return jsonify({"error": "Missing 'text' field in request"}), 400
+
+            text = data.get('text', '').strip()
+            if not text:
+                debug_print("[TTS] Invalid request - text is empty")
+                return jsonify({"error": "Text cannot be empty"}), 400
+
+            # Get voice and speed settings
+            voice = data.get('voice', 'en-US-Andrew:DragonHDLatestNeural')
+            speed = float(data.get('speed', 1.0))
+
+            # Clamp speed to valid range
+            speed = max(0.5, min(2.0, speed))
+
+            debug_print(f"[TTS] Request params - voice: {voice}, speed: {speed}, text_length: {len(text)}")
+
+            # Configure speech service
+            speech_config = speechsdk.SpeechConfig(
+                subscription=speech_key,
+                region=speech_region
+            )
+            speech_config.speech_synthesis_voice_name = voice
+
+            # Set output format to high quality
+            speech_config.set_speech_synthesis_output_format(
+                speechsdk.SpeechSynthesisOutputFormat.Audio48Khz192KBitRateMonoMp3
+            )
+
+            # Create synthesizer with no audio output config (returns audio data in result)
+            speech_synthesizer = speechsdk.SpeechSynthesizer(
+                speech_config=speech_config,
+                audio_config=None
+            )
+
+            # Perform synthesis with retry logic for rate limiting (429 errors)
+            max_retries = 3
+            retry_count = 0
+            last_error = None
+
+            while retry_count <= max_retries:
+                try:
+                    # Build SSML if speed adjustment needed
+                    if speed != 1.0:
+                        debug_print(f"[TTS] Using SSML with speed adjustment: {speed}x (attempt {retry_count + 1}/{max_retries + 1})")
+                        speed_percent = int(speed * 100)
+                        ssml = f"""
+                        <speak version='1.0' xmlns='http://www.w3.org/2001/10/synthesis' xml:lang='en-US'>
+                            <voice name='{voice}'>
+                                <prosody rate='{speed_percent}%'>
+                                    {text}
+                                </prosody>
+                            </voice>
+                        </speak>
+                        """
+                        result = speech_synthesizer.speak_ssml_async(ssml).get()
+                    else:
+                        debug_print(f"[TTS] Using plain text synthesis (attempt {retry_count + 1}/{max_retries + 1})")
+                        result = speech_synthesizer.speak_text_async(text).get()
+
+                    # Check for rate limiting or capacity issues
+                    if result.reason == speechsdk.ResultReason.Canceled:
+                        cancellation_details = result.cancellation_details
+                        if cancellation_details.reason == speechsdk.CancellationReason.Error:
+                            error_details = cancellation_details.error_details
+
+                            # Check if it's a rate limit error (429 or similar)
+                            if "429" in error_details or "rate" in error_details.lower() or "quota" in error_details.lower() or "throttl" in error_details.lower():
+                                if retry_count < max_retries:
+                                    # Randomized delay of roughly 50-1000ms with exponential backoff
+                                    base_delay = 0.05 + (retry_count * 0.1)  # 50ms, 150ms, 250ms base
+                                    jitter = random.uniform(0, 0.75)  # Up to 750ms jitter
+                                    delay = base_delay +
jitter + debug_print(f"[TTS] Rate limit detected (429), retrying in {delay*1000:.0f}ms (attempt {retry_count + 1}/{max_retries})") + time.sleep(delay) + retry_count += 1 + last_error = error_details + continue # Retry + else: + debug_print(f"[TTS] ERROR - Rate limit exceeded after {max_retries} retries") + return jsonify({"error": "Service temporarily unavailable due to high load. Please try again."}), 429 + else: + # Other error, don't retry + error_msg = f"Speech synthesis canceled: {cancellation_details.reason} - {error_details}" + debug_print(f"[TTS] ERROR - Synthesis failed: {error_msg}") + return jsonify({"error": error_msg}), 500 + + # Success - break out of retry loop + break + + except Exception as e: + # Network or other transient errors + if retry_count < max_retries and ("timeout" in str(e).lower() or "connection" in str(e).lower()): + delay = 0.05 + (retry_count * 0.1) + random.uniform(0, 0.75) + debug_print(f"[TTS] Transient error, retrying in {delay*1000:.0f}ms: {str(e)}") + log_event(f"TTS transient error, retrying: {str(e)}", level=logging.WARNING) + time.sleep(delay) + retry_count += 1 + last_error = str(e) + continue + else: + raise # Re-raise if not retryable or out of retries + + # Check result after retries + if result.reason == speechsdk.ResultReason.SynthesizingAudioCompleted: + debug_print(f"[TTS] Synthesis completed successfully - audio_size: {len(result.audio_data)} bytes") + if retry_count > 0: + debug_print(f"[TTS] Success after {retry_count} retries") + # Get audio data + audio_data = result.audio_data + + # Return audio stream + return send_file( + io.BytesIO(audio_data), + mimetype='audio/mpeg', + as_attachment=False, + download_name='speech.mp3' + ) + + elif result.reason == speechsdk.ResultReason.Canceled: + cancellation_details = result.cancellation_details + error_msg = f"Speech synthesis canceled: {cancellation_details.reason}" + if cancellation_details.reason == speechsdk.CancellationReason.Error: + error_msg += f" - {cancellation_details.error_details}" + debug_print(f"[TTS] ERROR - Synthesis failed: {error_msg}") + print(f"[ERROR] TTS synthesis failed: {error_msg}") + return jsonify({"error": error_msg}), 500 + else: + debug_print(f"[TTS] ERROR - Unknown synthesis error, reason: {result.reason}") + return jsonify({"error": "Unknown synthesis error"}), 500 + + except ValueError as e: + debug_print(f"[TTS] ERROR - Invalid parameter: {str(e)}") + return jsonify({"error": f"Invalid parameter: {str(e)}"}), 400 + except Exception as e: + debug_print(f"[TTS] ERROR - Exception: {str(e)}") + log_event(f"TTS synthesis failed: {str(e)}", level=logging.ERROR) + print(f"[ERROR] TTS synthesis exception: {str(e)}") + import traceback + traceback.print_exc() + return jsonify({"error": f"TTS synthesis failed: {str(e)}"}), 500 + + @app.route("/api/chat/tts/voices", methods=["GET"]) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + def get_available_voices(): + """ + Returns list of available DragonHD voices for TTS + """ + debug_print("[TTS] Get available voices request received") + voices = [ + {"name": "de-DE-Florian:DragonHDLatestNeural", "gender": "Male", "language": "German", "status": "GA"}, + {"name": "de-DE-Seraphina:DragonHDLatestNeural", "gender": "Female", "language": "German", "status": "GA"}, + {"name": "en-US-Adam:DragonHDLatestNeural", "gender": "Male", "language": "English (US)", "status": "GA"}, + {"name": "en-US-Alloy:DragonHDLatestNeural", "gender": "Male", "language": "English (US)", "status": "Preview"}, + 
{"name": "en-US-Andrew:DragonHDLatestNeural", "gender": "Male", "language": "English (US)", "status": "GA", "note": ""}, + {"name": "en-US-Andrew2:DragonHDLatestNeural", "gender": "Male", "language": "English (US)", "status": "GA", "note": "Optimized for conversational content"}, + {"name": "en-US-Andrew3:DragonHDLatestNeural", "gender": "Male", "language": "English (US)", "status": "Preview", "note": "Optimized for podcast content"}, + {"name": "en-US-Aria:DragonHDLatestNeural", "gender": "Female", "language": "English (US)", "status": "Preview"}, + {"name": "en-US-Ava:DragonHDLatestNeural", "gender": "Female", "language": "English (US)", "status": "GA"}, + {"name": "en-US-Ava3:DragonHDLatestNeural", "gender": "Female", "language": "English (US)", "status": "Preview", "note": "Optimized for podcast content"}, + {"name": "en-US-Brian:DragonHDLatestNeural", "gender": "Male", "language": "English (US)", "status": "GA"}, + {"name": "en-US-Davis:DragonHDLatestNeural", "gender": "Male", "language": "English (US)", "status": "GA"}, + {"name": "en-US-Emma:DragonHDLatestNeural", "gender": "Female", "language": "English (US)", "status": "GA"}, + {"name": "en-US-Emma2:DragonHDLatestNeural", "gender": "Female", "language": "English (US)", "status": "GA", "note": "Optimized for conversational content"}, + {"name": "en-US-Jenny:DragonHDLatestNeural", "gender": "Female", "language": "English (US)", "status": "Preview"}, + {"name": "en-US-MultiTalker-Ava-Andrew:DragonHDLatestNeural", "gender": "Multi", "language": "English (US)", "status": "Preview", "note": "Multiple speakers"}, + {"name": "en-US-Nova:DragonHDLatestNeural", "gender": "Female", "language": "English (US)", "status": "Preview"}, + {"name": "en-US-Phoebe:DragonHDLatestNeural", "gender": "Female", "language": "English (US)", "status": "Preview"}, + {"name": "en-US-Serena:DragonHDLatestNeural", "gender": "Female", "language": "English (US)", "status": "Preview"}, + {"name": "en-US-Steffan:DragonHDLatestNeural", "gender": "Male", "language": "English (US)", "status": "GA"}, + {"name": "es-ES-Tristan:DragonHDLatestNeural", "gender": "Male", "language": "Spanish (Spain)", "status": "GA"}, + {"name": "es-ES-Ximena:DragonHDLatestNeural", "gender": "Female", "language": "Spanish (Spain)", "status": "GA"}, + {"name": "fr-FR-Remy:DragonHDLatestNeural", "gender": "Male", "language": "French", "status": "GA"}, + {"name": "fr-FR-Vivienne:DragonHDLatestNeural", "gender": "Female", "language": "French", "status": "GA"}, + {"name": "ja-JP-Masaru:DragonHDLatestNeural", "gender": "Male", "language": "Japanese", "status": "GA"}, + {"name": "ja-JP-Nanami:DragonHDLatestNeural", "gender": "Female", "language": "Japanese", "status": "GA"}, + {"name": "zh-CN-Xiaochen:DragonHDLatestNeural", "gender": "Female", "language": "Chinese (Simplified)", "status": "GA"}, + {"name": "zh-CN-Yunfan:DragonHDLatestNeural", "gender": "Male", "language": "Chinese (Simplified)", "status": "GA"} + ] + + return jsonify({"voices": voices}), 200 diff --git a/application/single_app/route_backend_users.py b/application/single_app/route_backend_users.py index 0ee7cc13..d2ca52f8 100644 --- a/application/single_app/route_backend_users.py +++ b/application/single_app/route_backend_users.py @@ -155,6 +155,10 @@ def user_settings(): 'publicDirectorySavedLists', 'publicDirectorySettings', 'activePublicWorkspaceOid', # Chat UI settings 'navbar_layout', 'chatLayout', 'showChatTitle', 'chatSplitSizes', + # Microphone permission settings + 'microphonePermissionState', + # Text-to-speech settings + 
'ttsEnabled', 'ttsVoice', 'ttsSpeed', 'ttsAutoplay', # Metrics and other settings 'metrics', 'lastUpdated' } # Add others as needed diff --git a/application/single_app/route_external_health.py b/application/single_app/route_external_health.py index 4e22decb..ce4508d0 100644 --- a/application/single_app/route_external_health.py +++ b/application/single_app/route_external_health.py @@ -9,6 +9,7 @@ def register_route_external_health(app): @app.route('/external/healthcheck', methods=['GET']) + @swagger_route(security=get_auth_security()) @swagger_route() @enabled_required("enable_external_healthcheck") def external_health_check(): diff --git a/application/single_app/route_external_public_documents.py b/application/single_app/route_external_public_documents.py index b4a3ea7a..d3002d53 100644 --- a/application/single_app/route_external_public_documents.py +++ b/application/single_app/route_external_public_documents.py @@ -282,6 +282,9 @@ def external_patch_public_document(document_id): return jsonify({'error': 'Active public workspace not found'}), 404 data = request.get_json() + + # Track which fields were updated + updated_fields = {} try: if 'title' in data: @@ -291,6 +294,7 @@ def external_patch_public_document(document_id): user_id=user_id, title=data['title'] ) + updated_fields['title'] = data['title'] if 'abstract' in data: update_document( document_id=document_id, @@ -298,6 +302,7 @@ def external_patch_public_document(document_id): user_id=user_id, abstract=data['abstract'] ) + updated_fields['abstract'] = data['abstract'] if 'keywords' in data: if isinstance(data['keywords'], list): update_document( @@ -306,13 +311,16 @@ def external_patch_public_document(document_id): user_id=user_id, keywords=data['keywords'] ) + updated_fields['keywords'] = data['keywords'] else: + keywords_list = [kw.strip() for kw in data['keywords'].split(',')] update_document( document_id=document_id, public_workspace_id=active_workspace_id, user_id=user_id, - keywords=[kw.strip() for kw in data['keywords'].split(',')] + keywords=keywords_list ) + updated_fields['keywords'] = keywords_list if 'publication_date' in data: update_document( document_id=document_id, @@ -320,6 +328,7 @@ def external_patch_public_document(document_id): user_id=user_id, publication_date=data['publication_date'] ) + updated_fields['publication_date'] = data['publication_date'] if 'document_classification' in data: update_document( document_id=document_id, @@ -327,6 +336,7 @@ def external_patch_public_document(document_id): user_id=user_id, document_classification=data['document_classification'] ) + updated_fields['document_classification'] = data['document_classification'] if 'authors' in data: if isinstance(data['authors'], list): update_document( @@ -335,12 +345,31 @@ def external_patch_public_document(document_id): user_id=user_id, authors=data['authors'] ) + updated_fields['authors'] = data['authors'] else: + authors_list = [data['authors']] update_document( document_id=document_id, public_workspace_id=active_workspace_id, user_id=user_id, - authors=[data['authors']] + authors=authors_list + ) + updated_fields['authors'] = authors_list + + # Log the metadata update transaction if any fields were updated + if updated_fields: + from functions_documents import get_document + from functions_activity_logging import log_document_metadata_update_transaction + doc = get_document(user_id, document_id, public_workspace_id=active_workspace_id) + if doc: + log_document_metadata_update_transaction( + user_id=user_id, + document_id=document_id, + 
workspace_type='public', + file_name=doc.get('file_name', 'Unknown'), + updated_fields=updated_fields, + file_type=doc.get('file_type'), + public_workspace_id=active_workspace_id ) return jsonify({'message': 'Public document metadata updated successfully'}), 200 diff --git a/application/single_app/route_frontend_admin_settings.py b/application/single_app/route_frontend_admin_settings.py index 32498432..8cdf2236 100644 --- a/application/single_app/route_frontend_admin_settings.py +++ b/application/single_app/route_frontend_admin_settings.py @@ -155,6 +155,8 @@ def admin_settings(): settings['classification_banner_text'] = '' if 'classification_banner_color' not in settings: settings['classification_banner_color'] = '#ffc107' # Bootstrap warning color + if 'classification_banner_text_color' not in settings: + settings['classification_banner_text_color'] = '#ffffff' # White text by default # --- Add defaults for key vault if 'enable_key_vault_secret_storage' not in settings: @@ -235,10 +237,16 @@ def admin_settings(): # Get the persisted values for template rendering update_available = settings.get('update_available', False) latest_version = settings.get('latest_version_available') + + # Get user settings for profile and navigation + user_id = get_current_user_id() + user_settings = get_user_settings(user_id) return render_template( 'admin_settings.html', + app_settings=settings, # Admin needs unsanitized settings to view/edit all configuration settings=settings, + user_settings=user_settings, update_available=update_available, latest_version=latest_version, download_url=download_url @@ -268,6 +276,7 @@ def admin_settings(): require_member_of_create_public_workspace = form_data.get('require_member_of_create_public_workspace') == 'on' require_member_of_safety_violation_admin = form_data.get('require_member_of_safety_violation_admin') == 'on' require_member_of_control_center_admin = form_data.get('require_member_of_control_center_admin') == 'on' + require_member_of_control_center_dashboard_reader = form_data.get('require_member_of_control_center_dashboard_reader') == 'on' require_member_of_feedback_admin = form_data.get('require_member_of_feedback_admin') == 'on' # --- Handle Document Classification Toggle --- @@ -377,6 +386,7 @@ def admin_settings(): classification_banner_enabled = form_data.get('classification_banner_enabled') == 'on' classification_banner_text = form_data.get('classification_banner_text', '').strip() classification_banner_color = form_data.get('classification_banner_color', '#ffc107').strip() + classification_banner_text_color = form_data.get('classification_banner_text_color', '#ffffff').strip() # --- Application Insights Logging Toggle --- enable_appinsights_global_logging = form_data.get('enable_appinsights_global_logging') == 'on' @@ -458,6 +468,29 @@ def admin_settings(): else: file_processing_logs_turnoff_time_str = None + # --- Retention Policy Settings --- + enable_retention_policy_personal = form_data.get('enable_retention_policy_personal') == 'on' + enable_retention_policy_group = form_data.get('enable_retention_policy_group') == 'on' + enable_retention_policy_public = form_data.get('enable_retention_policy_public') == 'on' + retention_policy_execution_hour = int(form_data.get('retention_policy_execution_hour', 2)) + + # Validate execution hour (0-23) + if retention_policy_execution_hour < 0 or retention_policy_execution_hour > 23: + retention_policy_execution_hour = 2 # Default to 2 AM + + # Calculate next scheduled execution time if any retention policy is 
enabled + retention_policy_next_run = None + if enable_retention_policy_personal or enable_retention_policy_group or enable_retention_policy_public: + now = datetime.now(timezone.utc) + # Create next run datetime with the specified hour + next_run = now.replace(hour=retention_policy_execution_hour, minute=0, second=0, microsecond=0) + + # If the scheduled time has already passed today, schedule for tomorrow + if next_run <= now: + next_run = next_run + timedelta(days=1) + + retention_policy_next_run = next_run.isoformat() + # --- Authentication & Redirect Settings --- enable_front_door = form_data.get('enable_front_door') == 'on' front_door_url = form_data.get('front_door_url', '').strip() @@ -570,6 +603,13 @@ def is_valid_url(url): 'file_processing_logs_turnoff_time': file_processing_logs_turnoff_time_str, 'require_member_of_create_group': require_member_of_create_group, 'require_member_of_create_public_workspace': require_member_of_create_public_workspace, + + # Retention Policy + 'enable_retention_policy_personal': enable_retention_policy_personal, + 'enable_retention_policy_group': enable_retention_policy_group, + 'enable_retention_policy_public': enable_retention_policy_public, + 'retention_policy_execution_hour': retention_policy_execution_hour, + 'retention_policy_next_run': retention_policy_next_run, # Multimedia & Metadata 'enable_video_file_support': enable_video_file_support, @@ -670,6 +710,12 @@ def is_valid_url(url): 'speech_service_locale': form_data.get('speech_service_locale', '').strip(), 'speech_service_authentication_type': form_data.get('speech_service_authentication_type', 'key'), 'speech_service_key': form_data.get('speech_service_key', '').strip(), + + # Speech-to-text chat input + 'enable_speech_to_text_input': form_data.get('enable_speech_to_text_input') == 'on', + + # Text-to-speech chat output + 'enable_text_to_speech': form_data.get('enable_text_to_speech') == 'on', 'metadata_extraction_model': form_data.get('metadata_extraction_model', '').strip(), @@ -681,8 +727,10 @@ def is_valid_url(url): 'classification_banner_enabled': classification_banner_enabled, 'classification_banner_text': classification_banner_text, 'classification_banner_color': classification_banner_color, + 'classification_banner_text_color': classification_banner_text_color, - 'require_member_of_control_center_admin': require_member_of_control_center_admin + 'require_member_of_control_center_admin': require_member_of_control_center_admin, + 'require_member_of_control_center_dashboard_reader': require_member_of_control_center_dashboard_reader } # --- Prevent Legacy Fields from Being Created/Updated --- diff --git a/application/single_app/route_frontend_authentication.py b/application/single_app/route_frontend_authentication.py index 44834985..022ecf84 100644 --- a/application/single_app/route_frontend_authentication.py +++ b/application/single_app/route_frontend_authentication.py @@ -30,9 +30,7 @@ def build_front_door_urls(front_door_url): def register_route_frontend_authentication(app): @app.route('/login') - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) def login(): # Clear potentially stale cache/user info before starting new login session.pop("user", None) @@ -71,9 +69,7 @@ def login(): return redirect(auth_url) @app.route('/getAToken') # This is your redirect URI path - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) def authorized(): # Check for errors passed back from Azure AD if 
request.args.get('error'): @@ -138,7 +134,7 @@ def authorized(): if user_id: log_user_login(user_id, 'azure_ad') except Exception as e: - current_app.logger.warning(f"Could not log login activity: {e}") + debug_print(f"Could not log login activity: {e}") # Redirect to the originally intended page or home # You might want to store the original destination in the session during /login @@ -167,9 +163,7 @@ def authorized(): # This route is for API calls that need a token, not the web app login flow. This does not kick off a session. @app.route('/getATokenApi') # This is your redirect URI path - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) def authorized_api(): # Check for errors passed back from Azure AD if request.args.get('error'): @@ -214,9 +208,7 @@ def authorized_api(): return jsonify(result, 200) @app.route('/logout') - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) def logout(): user_name = session.get("user", {}).get("name", "User") # Get the user's email before clearing the session diff --git a/application/single_app/route_frontend_chats.py b/application/single_app/route_frontend_chats.py index 601f7bc0..af3ce9b1 100644 --- a/application/single_app/route_frontend_chats.py +++ b/application/single_app/route_frontend_chats.py @@ -12,9 +12,7 @@ def register_route_frontend_chats(app): @app.route('/chats', methods=['GET']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def chats(): @@ -60,9 +58,7 @@ def chats(): ) @app.route('/upload', methods=['POST']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required @file_upload_required @@ -435,9 +431,7 @@ def upload_file(): # THIS IS THE OLD ROUTE, KEEPING IT FOR REFERENCE, WILL DELETE LATER @app.route("/view_pdf", methods=["GET"]) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def view_pdf(): @@ -592,9 +586,7 @@ def view_pdf(): # --- Updated route --- @app.route('/view_document') - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def view_document(): diff --git a/application/single_app/route_frontend_control_center.py b/application/single_app/route_frontend_control_center.py index cf6bcc89..7215a60a 100644 --- a/application/single_app/route_frontend_control_center.py +++ b/application/single_app/route_frontend_control_center.py @@ -7,13 +7,13 @@ from swagger_wrapper import swagger_route, get_auth_security from datetime import datetime, timedelta import json +from functions_debug import debug_print def register_route_frontend_control_center(app): @app.route('/admin/control-center', methods=['GET']) @swagger_route(security=get_auth_security()) @login_required - @admin_required - @control_center_admin_required + @control_center_required('dashboard') def control_center(): """ Control Center main page for administrators. 
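
This hunk swaps the stacked `@admin_required` / `@control_center_admin_required` decorators for a single parameterized `@control_center_required('dashboard')`, in line with the new `require_member_of_control_center_dashboard_reader` setting added in admin settings. The decorator itself is defined elsewhere in the PR and is not shown in this hunk; the block below is only a minimal sketch of the pattern under stated assumptions (Flask session-based roles; the role name `ControlCenterAdmin` appears later in this diff, while `ControlCenterDashboardReader` and the scope-to-role mapping are assumptions for illustration, not the PR's actual implementation).

```python
# Minimal sketch of a scope-parameterized access decorator (illustrative only,
# not the decorator shipped in this PR). Assumes Flask session-based auth and
# that 'ControlCenterDashboardReader' is the reader role name (assumption).
from functools import wraps
from flask import session, jsonify

def control_center_required(scope):
    """Grant 'admin' scope only to ControlCenterAdmin; grant 'dashboard'
    scope to ControlCenterAdmin or ControlCenterDashboardReader."""
    scope_roles = {
        'admin': {'ControlCenterAdmin'},
        'dashboard': {'ControlCenterAdmin', 'ControlCenterDashboardReader'},
    }
    allowed_roles = scope_roles.get(scope, set())

    def decorator(view_func):
        @wraps(view_func)
        def wrapper(*args, **kwargs):
            user_roles = set(session.get('user', {}).get('roles', []))
            if user_roles & allowed_roles:
                return view_func(*args, **kwargs)
            # Simplified for the sketch; a real page route might redirect instead.
            return jsonify({'error': 'Insufficient permissions'}), 403
        return wrapper
    return decorator
```

A single factory like this keeps the dashboard readable by a reader role while reserving mutating Control Center routes for the admin role, which matches the `has_control_center_admin` flag the template receives in the hunk that follows.
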
@@ -27,14 +27,48 @@ def control_center(): # Get basic statistics for dashboard stats = get_control_center_statistics() + # Check user's role for frontend conditional rendering + user = session.get('user', {}) + has_admin_role = 'ControlCenterAdmin' in user.get('roles', []) + return render_template('control_center.html', app_settings=public_settings, settings=public_settings, - statistics=stats) + statistics=stats, + has_control_center_admin=has_admin_role) except Exception as e: - current_app.logger.error(f"Error loading control center: {e}") + debug_print(f"Error loading control center: {e}") flash(f"Error loading control center: {str(e)}", "error") return redirect(url_for('admin_settings')) + + @app.route('/approvals', methods=['GET']) + @login_required + @user_required + def approvals(): + """ + Approval Requests page accessible to group owners, admins, and control center admins. + Shows approval requests based on user's role and permissions. + """ + try: + # Get settings for configuration data + settings = get_settings() + public_settings = sanitize_settings_for_user(settings) + + # Get user settings for profile and navigation + user_id = get_current_user_id() + user_settings = get_user_settings(user_id) + + return render_template('approvals.html', + app_settings=public_settings, + settings=public_settings, + user_settings=user_settings) + except Exception as e: + import traceback + error_trace = traceback.format_exc() + debug_print(f"Error loading approvals: {e}\n{error_trace}") + print(f"ERROR IN APPROVALS ROUTE: {e}\n{error_trace}") + flash(f"Error loading approvals: {str(e)}", "error") + return redirect(url_for('index')) def get_control_center_statistics(): """ @@ -66,24 +100,27 @@ def get_control_center_statistics(): )) stats['total_users'] = user_result[0] if user_result else 0 except Exception as e: - current_app.logger.warning(f"Could not get user count: {e}") + debug_print(f"Could not get user count: {e}") - # Get active users in last 30 days using lastUpdated + # Get active users in last 30 days using login activity logs try: thirty_days_ago = (datetime.now() - timedelta(days=30)).isoformat() active_users_query = """ - SELECT VALUE COUNT(1) FROM c - WHERE c.lastUpdated >= @thirty_days_ago + SELECT VALUE COUNT(1) FROM ( + SELECT DISTINCT c.user_id FROM c + WHERE c.activity_type = 'user_login' + AND c.timestamp >= @thirty_days_ago + ) """ active_users_params = [{"name": "@thirty_days_ago", "value": thirty_days_ago}] - active_users_result = list(cosmos_user_settings_container.query_items( + active_users_result = list(cosmos_activity_logs_container.query_items( query=active_users_query, parameters=active_users_params, enable_cross_partition_query=True )) stats['active_users_30_days'] = active_users_result[0] if active_users_result else 0 except Exception as e: - current_app.logger.warning(f"Could not get active users count: {e}") + debug_print(f"Could not get active users count: {e}") # Get total groups count try: @@ -94,7 +131,7 @@ def get_control_center_statistics(): )) stats['total_groups'] = groups_result[0] if groups_result else 0 except Exception as e: - current_app.logger.warning(f"Could not get groups count: {e}") + debug_print(f"Could not get groups count: {e}") # Get groups created in last 30 days using createdDate try: @@ -111,7 +148,7 @@ def get_control_center_statistics(): )) stats['locked_groups'] = new_groups_result[0] if new_groups_result else 0 except Exception as e: - current_app.logger.warning(f"Could not get new groups count: {e}") + debug_print(f"Could not 
get new groups count: {e}") # Get total public workspaces count try: @@ -122,7 +159,7 @@ def get_control_center_statistics(): )) stats['total_public_workspaces'] = workspaces_result[0] if workspaces_result else 0 except Exception as e: - current_app.logger.warning(f"Could not get public workspaces count: {e}") + debug_print(f"Could not get public workspaces count: {e}") # Get public workspaces created in last 30 days using createdDate try: @@ -139,7 +176,7 @@ def get_control_center_statistics(): )) stats['hidden_workspaces'] = new_workspaces_result[0] if new_workspaces_result else 0 except Exception as e: - current_app.logger.warning(f"Could not get new public workspaces count: {e}") + debug_print(f"Could not get new public workspaces count: {e}") # Get blocked users count try: @@ -153,7 +190,7 @@ def get_control_center_statistics(): )) stats['blocked_users'] = blocked_result[0] if blocked_result else 0 except Exception as e: - current_app.logger.warning(f"Could not get blocked users count: {e}") + debug_print(f"Could not get blocked users count: {e}") # Get recent activity (last 24 hours) try: @@ -200,7 +237,7 @@ def get_control_center_statistics(): stats['recent_activity_24h']['documents'] = recent_docs[0] if recent_docs else 0 except Exception as e: - current_app.logger.warning(f"Could not get recent activity: {e}") + debug_print(f"Could not get recent activity: {e}") # Add alerts for blocked users if stats['blocked_users'] > 0: @@ -213,7 +250,7 @@ def get_control_center_statistics(): return stats except Exception as e: - current_app.logger.error(f"Error getting control center statistics: {e}") + debug_print(f"Error getting control center statistics: {e}") return { 'total_users': 0, 'active_users_30_days': 0, diff --git a/application/single_app/route_frontend_conversations.py b/application/single_app/route_frontend_conversations.py index 977c8779..a5d3f261 100644 --- a/application/single_app/route_frontend_conversations.py +++ b/application/single_app/route_frontend_conversations.py @@ -8,9 +8,7 @@ def register_route_frontend_conversations(app): @app.route('/conversations') - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def conversations(): @@ -31,9 +29,7 @@ def conversations(): return render_template('conversations.html', conversations=items) @app.route('/conversation/', methods=['GET']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def view_conversation(conversation_id): @@ -60,9 +56,7 @@ def view_conversation(conversation_id): return render_template('chat.html', conversation_id=conversation_id, messages=messages) @app.route('/conversation//messages', methods=['GET']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def get_conversation_messages(conversation_id): @@ -197,9 +191,7 @@ def get_conversation_messages(conversation_id): return jsonify({'messages': messages}) @app.route('/api/message//metadata', methods=['GET']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def get_message_metadata(message_id): diff --git a/application/single_app/route_frontend_feedback.py b/application/single_app/route_frontend_feedback.py index 50f405cc..15f13274 100644 --- a/application/single_app/route_frontend_feedback.py +++ 
b/application/single_app/route_frontend_feedback.py @@ -8,11 +8,8 @@ def register_route_frontend_feedback(app): @app.route("/admin/feedback_review") - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required - @admin_required @feedback_admin_required @enabled_required("enable_user_feedback") def admin_feedback_review(): @@ -23,9 +20,7 @@ def admin_feedback_review(): return render_template("admin_feedback_review.html") @app.route("/my_feedback") - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required @enabled_required("enable_user_feedback") diff --git a/application/single_app/route_frontend_group_workspaces.py b/application/single_app/route_frontend_group_workspaces.py index 1f95a74e..75996d84 100644 --- a/application/single_app/route_frontend_group_workspaces.py +++ b/application/single_app/route_frontend_group_workspaces.py @@ -7,9 +7,7 @@ def register_route_frontend_group_workspaces(app): @app.route('/group_workspaces', methods=['GET']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required @enabled_required("enable_group_workspaces") @@ -46,6 +44,18 @@ def group_workspaces(): ) ) legacy_count = legacy_docs_from_cosmos[0] if legacy_docs_from_cosmos else 0 + + # Build allowed extensions string + allowed_extensions = [ + "txt", "pdf", "doc", "docm", "docx", "xlsx", "xls", "xlsm","csv", "pptx", "html", + "jpg", "jpeg", "png", "bmp", "tiff", "tif", "heif", "md", "json", + "xml", "yaml", "yml", "log" + ] + if enable_video_file_support in [True, 'True', 'true']: + allowed_extensions += ["mp4", "mov", "avi", "wmv", "mkv", "webm"] + if enable_audio_file_support in [True, 'True', 'true']: + allowed_extensions += ["mp3", "wav", "ogg", "aac", "flac", "m4a"] + allowed_extensions_str = "Allowed: " + ", ".join(allowed_extensions) # Build allowed extensions string allowed_extensions = [ @@ -72,9 +82,7 @@ def group_workspaces(): ) @app.route('/set_active_group', methods=['POST']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required @enabled_required("enable_group_workspaces") diff --git a/application/single_app/route_frontend_groups.py b/application/single_app/route_frontend_groups.py index 900e729f..d48ec561 100644 --- a/application/single_app/route_frontend_groups.py +++ b/application/single_app/route_frontend_groups.py @@ -7,9 +7,7 @@ def register_route_frontend_groups(app): @app.route("/my_groups", methods=["GET"]) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required @enabled_required("enable_group_workspaces") @@ -30,9 +28,7 @@ def my_groups(): return render_template("my_groups.html", can_create_groups=can_create_groups) @app.route("/groups/", methods=["GET"]) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required @enabled_required("enable_group_workspaces") diff --git a/application/single_app/route_frontend_notifications.py b/application/single_app/route_frontend_notifications.py new file mode 100644 index 00000000..9897f01c --- /dev/null +++ b/application/single_app/route_frontend_notifications.py @@ -0,0 +1,28 @@ +# route_frontend_notifications.py + +from config import * +from functions_authentication import * +from functions_settings import * +from 
swagger_wrapper import swagger_route, get_auth_security + +def register_route_frontend_notifications(app): + + @app.route("/notifications") + @swagger_route(security=get_auth_security()) + @login_required + @user_required + def notifications(): + """ + Renders the notifications page for the current user. + """ + settings = get_settings() + public_settings = sanitize_settings_for_user(settings) + user_id = get_current_user_id() + user_settings = get_user_settings(user_id) + + return render_template( + "notifications.html", + app_settings=public_settings, + settings=public_settings, + user_settings=user_settings + ) diff --git a/application/single_app/route_frontend_profile.py b/application/single_app/route_frontend_profile.py index 8b152745..f03c92bb 100644 --- a/application/single_app/route_frontend_profile.py +++ b/application/single_app/route_frontend_profile.py @@ -3,21 +3,18 @@ from config import * from functions_authentication import * from swagger_wrapper import swagger_route, get_auth_security +import traceback def register_route_frontend_profile(app): @app.route('/profile') - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required def profile(): user = session.get('user') return render_template('profile.html', user=user) @app.route('/api/profile/image/refresh', methods=['POST']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def refresh_profile_image(): @@ -58,5 +55,308 @@ def refresh_profile_image(): return jsonify({"error": "Failed to update profile image settings"}), 500 except Exception as e: - print(f"Error refreshing profile image for user {user_id}: {e}") - return jsonify({"error": "Internal server error"}), 500 \ No newline at end of file + debug_print(f"Error refreshing profile image for user {user_id}: {e}") + log_event(f"Error refreshing profile image for user {user_id}: {str(e)}", level=logging.ERROR) + return jsonify({"error": "Internal server error"}), 500 + + @app.route('/api/user/activity-trends', methods=['GET']) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + def get_user_activity_trends(): + """ + Get time-series activity trends for the current user over the last 30 days. + Returns data for login activity, conversation creation, document uploads, and token usage. 
+ """ + try: + from datetime import datetime, timezone, timedelta + from collections import defaultdict + from config import cosmos_activity_logs_container, cosmos_conversations_container + from config import cosmos_user_documents_container, cosmos_messages_container + + user_id = get_current_user_id() + if not user_id: + return jsonify({"error": "Unable to identify user"}), 401 + + # Calculate date range for last 30 days + end_date = datetime.now(timezone.utc) + start_date = end_date - timedelta(days=30) + + # Initialize data structures for daily aggregation + logins_by_date = defaultdict(int) + conversations_by_date = defaultdict(int) + conversations_delete_by_date = defaultdict(int) + documents_upload_by_date = defaultdict(int) + documents_delete_by_date = defaultdict(int) + tokens_by_date = defaultdict(int) + + # Query 1: Get login activity from activity_logs + try: + login_query = """ + SELECT c.timestamp, c.created_at FROM c + WHERE c.user_id = @user_id + AND c.activity_type = 'user_login' + AND (c.timestamp >= @start_date OR c.created_at >= @start_date) + """ + login_params = [ + {"name": "@user_id", "value": user_id}, + {"name": "@start_date", "value": start_date.isoformat()} + ] + login_records = list(cosmos_activity_logs_container.query_items( + query=login_query, + parameters=login_params, + enable_cross_partition_query=True + )) + + for record in login_records: + timestamp = record.get('timestamp') or record.get('created_at') + if timestamp: + try: + dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00')) + date_key = dt.strftime('%Y-%m-%d') + logins_by_date[date_key] += 1 + except: + pass + except Exception as e: + debug_print(f"Error fetching login trends: {e}") + log_event(f"Error fetching login trends: {str(e)}", level=logging.ERROR) + + # Query 2: Get conversation creation activity from activity_logs + try: + conv_query = """ + SELECT c.timestamp, c.created_at FROM c + WHERE c.user_id = @user_id + AND c.activity_type = 'conversation_creation' + AND (c.timestamp >= @start_date OR c.created_at >= @start_date) + """ + conv_params = [ + {"name": "@user_id", "value": user_id}, + {"name": "@start_date", "value": start_date.isoformat()} + ] + conv_records = list(cosmos_activity_logs_container.query_items( + query=conv_query, + parameters=conv_params, + enable_cross_partition_query=True + )) + + for record in conv_records: + timestamp = record.get('timestamp') or record.get('created_at') + if timestamp: + try: + dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00')) + date_key = dt.strftime('%Y-%m-%d') + conversations_by_date[date_key] += 1 + except: + pass + except Exception as e: + debug_print(f"Error fetching conversation trends: {e}") + log_event(f"Error fetching conversation trends: {str(e)}", level=logging.ERROR) + + # Query 2b: Get conversation deletion activity from activity_logs + try: + conv_delete_query = """ + SELECT c.timestamp, c.created_at FROM c + WHERE c.user_id = @user_id + AND c.activity_type = 'conversation_deletion' + AND (c.timestamp >= @start_date OR c.created_at >= @start_date) + """ + conv_delete_records = list(cosmos_activity_logs_container.query_items( + query=conv_delete_query, + parameters=conv_params, + enable_cross_partition_query=True + )) + + for record in conv_delete_records: + timestamp = record.get('timestamp') or record.get('created_at') + if timestamp: + try: + dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00')) + date_key = dt.strftime('%Y-%m-%d') + conversations_delete_by_date[date_key] += 1 + except: + pass + except 
Exception as e: + debug_print(f"Error fetching conversation deletion trends: {e}") + log_event(f"Error fetching conversation deletion trends: {str(e)}", level=logging.ERROR) + + # Query 3: Get document upload activity from activity_logs + try: + doc_upload_query = """ + SELECT c.timestamp, c.created_at FROM c + WHERE c.user_id = @user_id + AND c.activity_type = 'document_creation' + AND (c.timestamp >= @start_date OR c.created_at >= @start_date) + """ + doc_params = [ + {"name": "@user_id", "value": user_id}, + {"name": "@start_date", "value": start_date.isoformat()} + ] + doc_records = list(cosmos_activity_logs_container.query_items( + query=doc_upload_query, + parameters=doc_params, + enable_cross_partition_query=True + )) + + for record in doc_records: + timestamp = record.get('timestamp') or record.get('created_at') + if timestamp: + try: + dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00')) + date_key = dt.strftime('%Y-%m-%d') + documents_upload_by_date[date_key] += 1 + except: + pass + except Exception as e: + debug_print(f"Error fetching document upload trends: {e}") + log_event(f"Error fetching document upload trends: {str(e)}", level=logging.ERROR) + + # Query 3b: Get document delete activity from activity_logs + try: + doc_delete_query = """ + SELECT c.timestamp, c.created_at FROM c + WHERE c.user_id = @user_id + AND c.activity_type = 'document_deletion' + AND (c.timestamp >= @start_date OR c.created_at >= @start_date) + """ + doc_delete_records = list(cosmos_activity_logs_container.query_items( + query=doc_delete_query, + parameters=doc_params, + enable_cross_partition_query=True + )) + + for record in doc_delete_records: + timestamp = record.get('timestamp') or record.get('created_at') + if timestamp: + try: + dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00')) + date_key = dt.strftime('%Y-%m-%d') + documents_delete_by_date[date_key] += 1 + except: + pass + except Exception as e: + debug_print(f"Error fetching document delete trends: {e}") + log_event(f"Error fetching document delete trends: {str(e)}", level=logging.ERROR) + + # Query 4: Get token usage from activity_logs + try: + token_query = """ + SELECT c.timestamp, c.created_at, c.usage FROM c + WHERE c.user_id = @user_id + AND c.activity_type = 'token_usage' + AND (c.timestamp >= @start_date OR c.created_at >= @start_date) + """ + token_params = [ + {"name": "@user_id", "value": user_id}, + {"name": "@start_date", "value": start_date.isoformat()} + ] + token_records = list(cosmos_activity_logs_container.query_items( + query=token_query, + parameters=token_params, + enable_cross_partition_query=True + )) + + for record in token_records: + timestamp = record.get('timestamp') or record.get('created_at') + if timestamp: + try: + dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00')) + date_key = dt.strftime('%Y-%m-%d') + # Extract total tokens from usage field + usage = record.get('usage', {}) + total_tokens = usage.get('total_tokens', 0) + tokens_by_date[date_key] += total_tokens + except: + pass + except Exception as e: + debug_print(f"Error fetching token usage trends: {e}") + log_event(f"Error fetching token usage trends: {str(e)}", level=logging.ERROR) + + # Generate complete date range (last 30 days) + date_range = [] + for i in range(30): + date = end_date - timedelta(days=29-i) + date_range.append(date.strftime('%Y-%m-%d')) + + # Format data for Chart.js + logins_data = [{"date": date, "count": logins_by_date.get(date, 0)} for date in date_range] + conversations_data = { + "creates": 
[{"date": date, "count": conversations_by_date.get(date, 0)} for date in date_range], + "deletes": [{"date": date, "count": conversations_delete_by_date.get(date, 0)} for date in date_range] + } + documents_data = { + "uploads": [{"date": date, "count": documents_upload_by_date.get(date, 0)} for date in date_range], + "deletes": [{"date": date, "count": documents_delete_by_date.get(date, 0)} for date in date_range] + } + tokens_data = [{"date": date, "tokens": tokens_by_date.get(date, 0)} for date in date_range] + + # Get storage metrics from user settings + from functions_settings import get_user_settings + user_settings = get_user_settings(user_id) + metrics = user_settings.get('settings', {}).get('metrics', {}) + document_metrics = metrics.get('document_metrics', {}) + + storage_data = { + "ai_search_size": document_metrics.get('ai_search_size', 0), + "storage_account_size": document_metrics.get('storage_account_size', 0) + } + + return jsonify({ + "success": True, + "logins": logins_data, + "conversations": conversations_data, + "documents": documents_data, + "tokens": tokens_data, + "storage": storage_data + }), 200 + + except Exception as e: + debug_print(f"Error fetching user activity trends: {e}") + log_event(f"Error fetching user activity trends: {str(e)}", level=logging.ERROR) + traceback.print_exc() + return jsonify({"error": "Failed to fetch activity trends"}), 500 + + @app.route('/api/user/settings', methods=['GET']) + @swagger_route(security=get_auth_security()) + @login_required + @user_required + def get_user_settings_api(): + """ + Get current user's settings including cached metrics. + """ + try: + from functions_settings import get_user_settings + + user_id = get_current_user_id() + if not user_id: + return jsonify({"error": "Unable to identify user"}), 401 + + user_settings = get_user_settings(user_id) + + # Extract relevant data for frontend + settings = user_settings.get('settings', {}) + metrics = settings.get('metrics', {}) + + # Return ALL settings from Cosmos for backwards compatibility + # This matches the old API behavior: return jsonify(user_settings_data), 200 + response_data = { + "success": True, + "settings": settings, # Return entire settings object + "metrics": metrics, + "retention_policy": { + "enabled": settings.get('retention_policy_enabled', False), + "days": settings.get('retention_policy_days', 30) + }, + "display_name": user_settings.get('display_name'), + "email": user_settings.get('email'), + "lastUpdated": user_settings.get('lastUpdated'), + # Add at root level for backwards compatibility with agents code + "selected_agent": settings.get('selected_agent') + } + + return jsonify(response_data), 200 + + except Exception as e: + debug_print(f"Error fetching user settings: {e}") + log_event(f"Error fetching user settings: {str(e)}", level=logging.ERROR) + traceback.print_exc() + return jsonify({"error": "Failed to fetch user settings"}), 500 \ No newline at end of file diff --git a/application/single_app/route_frontend_public_workspaces.py b/application/single_app/route_frontend_public_workspaces.py index 0b9b208e..10235444 100644 --- a/application/single_app/route_frontend_public_workspaces.py +++ b/application/single_app/route_frontend_public_workspaces.py @@ -7,14 +7,13 @@ def register_route_frontend_public_workspaces(app): @app.route("/my_public_workspaces", methods=["GET"]) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required 
@enabled_required("enable_public_workspaces") def my_public_workspaces(): user = session.get('user', {}) + user_id = get_current_user_id() settings = get_settings() require_member_of_create_public_workspace = settings.get("require_member_of_create_public_workspace", False) @@ -23,18 +22,21 @@ def my_public_workspaces(): if require_member_of_create_public_workspace: can_create_public_workspaces = 'roles' in user and 'CreatePublicWorkspaces' in user['roles'] + # Get user settings to retrieve active public workspace ID + user_settings = get_user_settings(user_id) + active_public_workspace_id = user_settings.get("settings", {}).get("activePublicWorkspaceOid", "") + public_settings = sanitize_settings_for_user(settings) return render_template( "my_public_workspaces.html", settings=public_settings, app_settings=public_settings, - can_create_public_workspaces=can_create_public_workspaces + can_create_public_workspaces=can_create_public_workspaces, + active_public_workspace_id=active_public_workspace_id ) @app.route("/public_workspaces/", methods=["GET"]) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required @enabled_required("enable_public_workspaces") @@ -49,9 +51,7 @@ def manage_public_workspace(workspace_id): ) @app.route("/public_workspaces", methods=["GET"]) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required @enabled_required("enable_public_workspaces") @@ -93,9 +93,7 @@ def public_workspaces(): ) @app.route("/public_directory", methods=["GET"]) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required @enabled_required("enable_public_workspaces") @@ -114,9 +112,7 @@ def public_directory(): ) @app.route('/set_active_public_workspace', methods=['POST']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required @enabled_required("enable_public_workspaces") diff --git a/application/single_app/route_frontend_safety.py b/application/single_app/route_frontend_safety.py index 9dfb1f38..32773199 100644 --- a/application/single_app/route_frontend_safety.py +++ b/application/single_app/route_frontend_safety.py @@ -8,11 +8,8 @@ def register_route_frontend_safety(app): @app.route('/admin/safety_violations', methods=['GET']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required - @admin_required @safety_violation_admin_required @enabled_required("enable_content_safety") def admin_safety_violations(): @@ -22,9 +19,7 @@ def admin_safety_violations(): return render_template('admin_safety_violations.html') @app.route('/safety_violations', methods=['GET']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required @enabled_required("enable_content_safety") diff --git a/application/single_app/route_frontend_workspace.py b/application/single_app/route_frontend_workspace.py index dc9bb813..47f121e0 100644 --- a/application/single_app/route_frontend_workspace.py +++ b/application/single_app/route_frontend_workspace.py @@ -7,9 +7,7 @@ def register_route_frontend_workspace(app): @app.route('/workspace', methods=['GET']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required 
@enabled_required("enable_user_workspace") @@ -44,7 +42,7 @@ def workspace(): ) ) legacy_count = legacy_docs_from_cosmos[0] if legacy_docs_from_cosmos else 0 - + # Build allowed extensions string allowed_extensions = [ "txt", "pdf", "doc", "docm", "docx", "xlsx", "xls", "xlsm","csv", "pptx", "html", diff --git a/application/single_app/route_openapi.py b/application/single_app/route_openapi.py index 08a86f41..238e9a4c 100644 --- a/application/single_app/route_openapi.py +++ b/application/single_app/route_openapi.py @@ -15,15 +15,13 @@ from openapi_auth_analyzer import analyze_openapi_authentication, get_authentication_help_text from swagger_wrapper import swagger_route, get_auth_security from functions_security import is_valid_storage_name - +from functions_debug import debug_print def register_openapi_routes(app): """Register OpenAPI-related routes.""" @app.route('/api/openapi/upload', methods=['POST']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def upload_openapi_spec(): @@ -132,16 +130,14 @@ def upload_openapi_spec(): os.unlink(temp_path) except Exception as e: - current_app.logger.error(f"Error uploading OpenAPI spec: {str(e)}") + debug_print(f"Error uploading OpenAPI spec: {str(e)}") return jsonify({ 'success': False, 'error': 'Internal server error during upload' }), 500 @app.route('/api/openapi/validate-url', methods=['POST']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def validate_openapi_url(): @@ -232,16 +228,14 @@ def validate_openapi_url(): }) except Exception as e: - current_app.logger.error(f"Error validating OpenAPI URL: {str(e)}") + debug_print(f"Error validating OpenAPI URL: {str(e)}") return jsonify({ 'success': False, 'error': 'Internal server error during validation' }), 500 @app.route('/api/openapi/download-from-url', methods=['POST']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def download_openapi_from_url(): @@ -344,16 +338,14 @@ def download_openapi_from_url(): }) except Exception as e: - current_app.logger.error(f"Error downloading OpenAPI spec from URL: {str(e)}") + debug_print(f"Error downloading OpenAPI spec from URL: {str(e)}") return jsonify({ 'success': False, 'error': 'Internal server error during download' }), 500 @app.route('/api/openapi/list-uploaded', methods=['GET']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def list_uploaded_specs(): @@ -394,7 +386,7 @@ def list_uploaded_specs(): 'last_modified': os.path.getmtime(file_path) }) except Exception as e: - current_app.logger.warning(f"Could not read spec file {filename}: {str(e)}") + debug_print(f"Could not read spec file {filename}: {str(e)}") continue return jsonify({ @@ -403,16 +395,14 @@ def list_uploaded_specs(): }) except Exception as e: - current_app.logger.error(f"Error listing OpenAPI specs: {str(e)}") + debug_print(f"Error listing OpenAPI specs: {str(e)}") return jsonify({ 'success': False, 'error': 'Internal server error while listing specifications' }), 500 @app.route('/api/openapi/analyze-auth', methods=['POST']) - @swagger_route( - security=get_auth_security() - ) + @swagger_route(security=get_auth_security()) @login_required @user_required def analyze_openapi_auth(): @@ -453,7 +443,7 @@ def analyze_openapi_auth(): }) except Exception as e: - 
current_app.logger.error(f"Error analyzing authentication: {str(e)}") + debug_print(f"Error analyzing authentication: {str(e)}") return jsonify({ 'success': False, 'error': 'Internal server error during authentication analysis' diff --git a/application/single_app/static/css/chat-speech-input.css b/application/single_app/static/css/chat-speech-input.css new file mode 100644 index 00000000..eaeba8f1 --- /dev/null +++ b/application/single_app/static/css/chat-speech-input.css @@ -0,0 +1,112 @@ +/* chat-speech-input.css */ +/* Styles for speech-to-text chat input feature */ + +/* Speech input button positioning */ +#speech-input-btn { + width: 36px; + height: 36px; + border-radius: 0.375rem; + padding: 0; + display: flex; + align-items: center; + justify-content: center; + z-index: 2; +} + +#speech-input-btn:hover { + background-color: #e9ecef; + border-color: #adb5bd; +} + +#speech-input-btn:active { + background-color: #dee2e6; +} + +/* Recording UI Container */ +#recording-container { + min-height: 38px; +} + +.recording-ui { + padding: 8px 12px; + background: #f8f9fa; + border-radius: 0.375rem; + border: 1px solid #0d6efd; +} + +/* Waveform Canvas */ +#waveform-canvas { + height: 36px; + background-color: #ffffff; + border-radius: 0.25rem; + border: 1px solid #dee2e6; +} + +/* Recording Buttons */ +.recording-buttons .btn { + min-width: 36px; + width: 36px; + height: 36px; + font-weight: 600; + display: flex; + align-items: center; + justify-content: center; + padding: 0; + border-radius: 0.375rem; +} + +.recording-buttons .btn i { + font-size: 1rem; +} + +/* Adjust textarea padding when speech button is visible */ +#user-input { + padding-left: 55px !important; +} + +/* Dark mode support */ +[data-bs-theme="dark"] #speech-input-btn { + background-color: transparent; + border-color: #6c757d; + color: #adb5bd; +} + +[data-bs-theme="dark"] #speech-input-btn:hover { + background-color: #343a40; + border-color: #adb5bd; +} + +[data-bs-theme="dark"] #speech-input-btn:active { + background-color: #495057; +} + +[data-bs-theme="dark"] .recording-ui { + background: #212529; + border-color: #0d6efd; +} + +[data-bs-theme="dark"] #waveform-canvas { + background-color: #343a40; + border-color: #495057; +} + +/* Responsive adjustments */ +@media (max-width: 768px) { + .recording-controls { + flex-direction: column; + gap: 12px; + } + + .recording-buttons { + width: 100%; + } + + .recording-buttons .btn { + flex: 1; + } + + .countdown-timer { + font-size: 1.5rem; + width: 100%; + } +} diff --git a/application/single_app/static/css/chats.css b/application/single_app/static/css/chats.css index d86fe287..38e11c3a 100644 --- a/application/single_app/static/css/chats.css +++ b/application/single_app/static/css/chats.css @@ -1613,4 +1613,67 @@ mark.search-highlight { .reasoning-level-label { font-weight: 600; font-size: 0.9rem; +} + +/* Text-to-Speech Styles */ +.tts-play-btn { + padding: 0.25rem 0.5rem; + transition: all 0.2s ease; +} + +.tts-play-btn:hover { + color: var(--bs-primary) !important; + transform: scale(1.1); +} + +.tts-play-btn.btn-success { + color: var(--bs-success) !important; +} + +.tts-play-btn.btn-warning { + color: var(--bs-warning) !important; +} + +.message.tts-playing .avatar { + animation: tts-avatar-pulse 0.3s ease-in-out infinite alternate; + border-radius: 50%; +} + +.message.tts-playing .avatar.volume-low { + box-shadow: 0 0 8px rgba(13, 110, 253, 0.4); +} + +.message.tts-playing .avatar.volume-medium { + box-shadow: 0 0 15px rgba(13, 110, 253, 0.6); +} + +.message.tts-playing 
.avatar.volume-high { + box-shadow: 0 0 25px rgba(13, 110, 253, 0.8); +} + +.message.tts-playing .avatar.volume-peak { + box-shadow: 0 0 35px rgba(13, 110, 253, 1); +} + +/* Word-by-word highlighting for TTS */ +.tts-word { + display: inline; + transition: background-color 0.2s ease, color 0.2s ease; +} + +.tts-word.tts-current-word { + background-color: rgba(var(--bs-primary-rgb), 0.3); + color: var(--bs-primary); + font-weight: 500; + border-radius: 2px; + padding: 0 2px; +} + +@keyframes tts-avatar-pulse { + 0% { + transform: scale(1); + } + 100% { + transform: scale(1.05); + } } \ No newline at end of file diff --git a/application/single_app/static/js/admin/admin_settings.js b/application/single_app/static/js/admin/admin_settings.js index 2864bd3d..6b3ed8c2 100644 --- a/application/single_app/static/js/admin/admin_settings.js +++ b/application/single_app/static/js/admin/admin_settings.js @@ -1565,6 +1565,15 @@ function setupToggles() { }); } + const enableKeyVaultCheckbox = document.getElementById('enable_key_vault_secret_storage'); + if (enableKeyVaultCheckbox) { + enableKeyVaultCheckbox.addEventListener('change', function() { + const keyVaultSettings = document.getElementById('key_vault_settings'); + keyVaultSettings.style.display = this.checked ? 'block' : 'none'; + markFormAsModified(); + }); + } + const enableWebSearch = document.getElementById('enable_web_search'); if (enableWebSearch) { enableWebSearch.addEventListener('change', function () { diff --git a/application/single_app/static/js/admin/admin_sidebar_nav.js b/application/single_app/static/js/admin/admin_sidebar_nav.js index de71f7b0..72965781 100644 --- a/application/single_app/static/js/admin/admin_sidebar_nav.js +++ b/application/single_app/static/js/admin/admin_sidebar_nav.js @@ -203,6 +203,8 @@ function scrollToSection(sectionId) { 'user-feedback-section': 'user-feedback-section', 'permissions-section': 'permissions-section', 'conversation-archiving-section': 'conversation-archiving-section', + // Security tab sections + 'keyvault-section': 'keyvault-section', // Search & Extract tab sections 'azure-ai-search-section': 'azure-ai-search-section', 'document-intelligence-section': 'document-intelligence-section', diff --git a/application/single_app/static/js/chat/chat-conversations.js b/application/single_app/static/js/chat/chat-conversations.js index b85675c3..7d1990cf 100644 --- a/application/single_app/static/js/chat/chat-conversations.js +++ b/application/single_app/static/js/chat/chat-conversations.js @@ -992,6 +992,14 @@ export function deleteConversation(conversationId) { export async function createNewConversation(callback) { // Disable new button? Show loading? 
if (newConversationBtn) newConversationBtn.disabled = true; + + // Clear the chatbox immediately when creating new conversation + const chatbox = document.getElementById("chatbox"); + if (chatbox && !callback) { + // Only clear if there's no callback (i.e., not sending a message immediately) + chatbox.innerHTML = ""; + } + try { const response = await fetch("/api/create_conversation", { method: "POST", @@ -1012,14 +1020,24 @@ export async function createNewConversation(callback) { currentConversationId = data.conversation_id; // Add to list (pass empty classifications for new convo) addConversationToList(data.conversation_id, data.title /* Use title from API if provided */, []); - // Select the new conversation to update header and chatbox - selectConversation(data.conversation_id); + + // Don't call selectConversation here if we're about to send a message + // because selectConversation clears the chatbox, which would remove + // the user message that's about to be appended by actuallySendMessage + // Instead, just update the UI elements directly + window.currentConversationId = data.conversation_id; + const titleEl = document.getElementById("current-conversation-title"); + if (titleEl) { + titleEl.textContent = data.title || "New Conversation"; + } + console.log('[createNewConversation] Created conversation without reload:', data.conversation_id); // Execute callback if provided (e.g., to send the first message) if (typeof callback === "function") { callback(); } + } catch (error) { console.error("Error creating conversation:", error); showToast(`Failed to create a new conversation: ${error.message}`, "danger"); diff --git a/application/single_app/static/js/chat/chat-documents.js b/application/single_app/static/js/chat/chat-documents.js index c2dbefad..174a7c7d 100644 --- a/application/single_app/static/js/chat/chat-documents.js +++ b/application/single_app/static/js/chat/chat-documents.js @@ -1,7 +1,7 @@ // chat-documents.js -import { showToast } from "./chat-toast.js"; // Assuming you have this -import { toBoolean } from "./chat-utils.js"; // Import the toBoolean helper +import { showToast } from "./chat-toast.js"; +import { toBoolean } from "./chat-utils.js"; export const docScopeSelect = document.getElementById("doc-scope-select"); const searchDocumentsBtn = document.getElementById("search-documents-btn"); diff --git a/application/single_app/static/js/chat/chat-messages.js b/application/single_app/static/js/chat/chat-messages.js index 3ff0f070..48fc6166 100644 --- a/application/single_app/static/js/chat/chat-messages.js +++ b/application/single_app/static/js/chat/chat-messages.js @@ -15,11 +15,24 @@ import { import { updateSidebarConversationTitle } from "./chat-sidebar-conversations.js"; import { escapeHtml, isColorLight, addTargetBlankToExternalLinks } from "./chat-utils.js"; import { showToast } from "./chat-toast.js"; +import { autoplayTTSIfEnabled } from "./chat-tts.js"; import { saveUserSetting } from "./chat-layout.js"; import { isStreamingEnabled, sendMessageWithStreaming } from "./chat-streaming.js"; import { getCurrentReasoningEffort, isReasoningEffortEnabled } from './chat-reasoning.js'; import { areAgentsEnabled } from './chat-agents.js'; +// Conditionally import TTS if enabled +let ttsModule = null; +if (typeof window.appSettings !== 'undefined' && window.appSettings.enable_text_to_speech) { + import('./chat-tts.js').then(module => { + ttsModule = module; + console.log('TTS module loaded'); + module.initializeTTS(); + }).catch(error => { + console.error('Failed to load 
TTS module:', error); + }); +} + /** * Unwraps markdown tables that are mistakenly wrapped in code blocks. * This fixes the issue where AI responses contain tables in code blocks, @@ -545,7 +558,8 @@ export function appendMessage( agentCitations = [], agentDisplayName = null, agentName = null, - fullMessageObject = null + fullMessageObject = null, + isNewMessage = false ) { if (!chatbox || sender === "System") return; @@ -616,6 +630,16 @@ export function appendMessage( const maskIcon = isMasked ? 'bi-front' : 'bi-back'; const maskTitle = isMasked ? 'Unmask all masked content' : 'Mask entire message'; + // TTS button (only for AI messages) + const ttsButtonHtml = (sender === 'AI' && typeof window.appSettings !== 'undefined' && window.appSettings.enable_text_to_speech) ? ` + + ` : ''; + const copyButtonHtml = ` `; - const copyAndFeedbackHtml = `
${actionsDropdownHtml}${copyButtonHtml}${maskButtonHtml}${carouselButtonsHtml}
`; + const copyAndFeedbackHtml = `
${actionsDropdownHtml}${ttsButtonHtml}${copyButtonHtml}${maskButtonHtml}${carouselButtonsHtml}
`; const citationsButtonsHtml = createCitationsHtml( hybridCitations, @@ -744,6 +768,11 @@ export function appendMessage( messageDiv.classList.add(messageClass); // Add AI message class chatbox.appendChild(messageDiv); // Append AI message + // Auto-play TTS if enabled (only for new messages, not when loading history) + if (isNewMessage && typeof autoplayTTSIfEnabled === 'function') { + autoplayTTSIfEnabled(messageId, messageContent); + } + // Highlight code blocks in the messages messageDiv.querySelectorAll('pre code[class^="language-"]').forEach((block) => { const match = block.className.match(/language-([a-zA-Z0-9]+)/); @@ -1533,7 +1562,9 @@ export function actuallySendMessage(finalMessageToSend) { data.web_search_citations, // Pass web citations data.agent_citations, // Pass agent citations data.agent_display_name, // Pass agent display name - data.agent_name // Pass agent name + data.agent_name, // Pass agent name + null, // fullMessageObject + true // isNewMessage - trigger autoplay for new responses ); } // Show kernel fallback notice if present @@ -1611,12 +1642,17 @@ export function actuallySendMessage(finalMessageToSend) { } } else { // New conversation case + console.log('[sendMessage] New conversation created, adding to list without reload'); addConversationToList( currentConversationId, data.conversation_title, data.classification || [] ); - selectConversation(currentConversationId); // Select the newly added one + // Don't call selectConversation here - messages are already displayed + // Just update the current conversation ID and title + window.currentConversationId = currentConversationId; + document.getElementById("current-conversation-title").textContent = data.conversation_title || "New Conversation"; + console.log('[sendMessage] New conversation setup complete, conversation ID:', currentConversationId); } } }) diff --git a/application/single_app/static/js/chat/chat-onload.js b/application/single_app/static/js/chat/chat-onload.js index d8a9c332..2a83b20b 100644 --- a/application/single_app/static/js/chat/chat-onload.js +++ b/application/single_app/static/js/chat/chat-onload.js @@ -10,6 +10,7 @@ import { showToast } from "./chat-toast.js"; import { initConversationInfoButton } from "./chat-conversation-info-button.js"; import { initializeStreamingToggle } from "./chat-streaming.js"; import { initializeReasoningToggle } from "./chat-reasoning.js"; +import { initializeSpeechInput } from "./chat-speech-input.js"; window.addEventListener('DOMContentLoaded', () => { console.log("DOM Content Loaded. 
Starting initializations."); // Log start @@ -24,6 +25,13 @@ window.addEventListener('DOMContentLoaded', () => { // Initialize reasoning toggle initializeReasoningToggle(); + + // Initialize speech input + try { + initializeSpeechInput(); + } catch (error) { + console.warn('Speech input initialization failed:', error); + } // Grab references to the relevant elements const userInput = document.getElementById("user-input"); diff --git a/application/single_app/static/js/chat/chat-speech-input.js b/application/single_app/static/js/chat/chat-speech-input.js new file mode 100644 index 00000000..e9baab23 --- /dev/null +++ b/application/single_app/static/js/chat/chat-speech-input.js @@ -0,0 +1,980 @@ +// chat-speech-input.js +/** + * Speech-to-text chat input module + * Handles voice recording with visual waveform feedback and transcription + */ + +import { showToast } from './chat-toast.js'; +import { sendMessage } from './chat-messages.js'; +import { saveUserSetting } from './chat-layout.js'; + +let mediaRecorder = null; +let audioChunks = []; +let recordingStartTime = null; +let countdownInterval = null; +let autoSendTimeout = null; +let autoSendCountdown = null; +let audioContext = null; +let analyser = null; +let animationFrame = null; +let stream = null; +let waveformData = []; // Store waveform amplitudes over time +let isCanceling = false; // Flag to track if recording is being canceled +let microphonePermissionState = 'prompt'; // 'granted', 'denied', or 'prompt' +let userMicrophonePreference = 'ask-every-session'; // User's permission preference +let sessionPermissionRequested = false; // Track if permission was requested this session + +const MAX_RECORDING_DURATION = 90; // seconds +let remainingTime = MAX_RECORDING_DURATION; + +/** + * Check if browser supports required APIs + */ +function checkBrowserSupport() { + if (!navigator.mediaDevices || !navigator.mediaDevices.getUserMedia) { + return { supported: false, message: 'Your browser does not support audio recording' }; + } + + if (!window.MediaRecorder) { + return { supported: false, message: 'Your browser does not support MediaRecorder API' }; + } + + if (!window.AudioContext && !window.webkitAudioContext) { + return { supported: false, message: 'Your browser does not support Web Audio API' }; + } + + return { supported: true }; +} + +/** + * Initialize speech input functionality + */ +export function initializeSpeechInput() { + console.log('Initializing speech input...'); + + const speechBtn = document.getElementById('speech-input-btn'); + + if (!speechBtn) { + console.warn('Speech input button not found in DOM'); + return; // Speech input not enabled + } + + console.log('Speech input button found:', speechBtn); + + // Check browser support + const support = checkBrowserSupport(); + if (!support.supported) { + speechBtn.style.display = 'none'; + console.warn('Speech input disabled:', support.message); + return; + } + + console.log('Browser supports speech input'); + + // Load user microphone preferences + loadMicrophonePreference().then(() => { + // Check permission state and update icon + checkMicrophonePermissionState(); + }); + + // Attach event listener + speechBtn.addEventListener('click', handleSpeechButtonClick); + + // Attach recording control listeners + const cancelBtn = document.getElementById('cancel-recording-btn'); + const sendBtn = document.getElementById('send-recording-btn'); + + if (cancelBtn) { + cancelBtn.addEventListener('click', cancelRecording); + console.log('Cancel button listener attached'); + } + + if 
(sendBtn) { + sendBtn.addEventListener('click', stopAndSendRecording); + console.log('Send button listener attached'); + } + + console.log('Speech input initialization complete'); +} + +/** + * Handle speech button click - check permission state first + */ +async function handleSpeechButtonClick() { + console.log('Speech button clicked!'); + + // If permission is denied, navigate to profile settings + if (microphonePermissionState === 'denied') { + console.log('Microphone permission denied, redirecting to profile settings'); + window.location.href = '/profile#speech-settings'; + return; + } + + // Check if we should request permission based on user preference + if (shouldRequestPermission()) { + await checkMicrophonePermissionState(); + } + + // Start recording + startRecording(); +} + +/** + * Check if we should request permission based on user preference + */ +function shouldRequestPermission() { + switch (userMicrophonePreference) { + case 'remember': + // Only request once ever + return microphonePermissionState === 'prompt'; + case 'ask-every-session': + // Request once per browser session + return !sessionPermissionRequested; + case 'ask-every-page-load': + // Request on every page load + return true; + default: + return !sessionPermissionRequested; + } +} + +/** + * Load user's microphone permission preference from settings + */ +async function loadMicrophonePreference() { + try { + const response = await fetch('/api/user/settings'); + const data = await response.json(); + const settings = data.settings || {}; + + // Microphone permission preference removed - browser controls permission state + console.log('Loaded microphone preference:', userMicrophonePreference); + + return userMicrophonePreference; + } catch (error) { + console.error('Error loading microphone preference:', error); + userMicrophonePreference = 'ask-every-session'; + return userMicrophonePreference; + } +} + +/** + * Check microphone permission state and update UI + */ +async function checkMicrophonePermissionState() { + try { + // Try to get media to check permission state + const stream = await navigator.mediaDevices.getUserMedia({ audio: true }); + + // Permission granted + stream.getTracks().forEach(track => track.stop()); + microphonePermissionState = 'granted'; + sessionPermissionRequested = true; + updateMicrophoneIconState('granted'); + + // Save state if preference is 'remember' + if (userMicrophonePreference === 'remember') { + await savePermissionState('granted'); + } + + } catch (error) { + if (error.name === 'NotAllowedError' || error.name === 'PermissionDeniedError') { + microphonePermissionState = 'denied'; + sessionPermissionRequested = true; + updateMicrophoneIconState('denied'); + + // Save state if preference is 'remember' + if (userMicrophonePreference === 'remember') { + await savePermissionState('denied'); + } + } else { + console.error('Error checking microphone permission:', error); + microphonePermissionState = 'prompt'; + updateMicrophoneIconState('prompt'); + } + } +} + +/** + * Update microphone icon state with color and tooltip + */ +function updateMicrophoneIconState(state) { + const speechBtn = document.getElementById('speech-input-btn'); + if (!speechBtn) return; + + const icon = speechBtn.querySelector('i'); + if (!icon) return; + + // Remove existing state classes + icon.classList.remove('text-success', 'text-danger', 'text-secondary'); + + switch(state) { + case 'granted': + icon.classList.add('text-success'); + speechBtn.title = 'Voice Input (Microphone access granted)'; + break; 
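+        // 'denied' turns the icon red; handleSpeechButtonClick() then sends the user to
+        // /profile#speech-settings instead of attempting to record.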
+        case 'denied':
+            icon.classList.add('text-danger');
+            speechBtn.title = 'Microphone access denied - Click to manage permissions';
+            break;
+        case 'prompt':
+        default:
+            icon.classList.add('text-secondary');
+            speechBtn.title = 'Voice Input (Click to enable microphone)';
+            break;
+    }
+
+    console.log('Updated microphone icon state:', state);
+}
+
+/**
+ * Save permission state to user settings
+ */
+async function savePermissionState(state) {
+    try {
+        await saveUserSetting({
+            microphonePermissionState: state
+        });
+        console.log('Saved microphone permission state:', state);
+    } catch (error) {
+        console.error('Error saving microphone permission state:', error);
+    }
+}
+
+/**
+ * Start recording audio
+ */
+async function startRecording() {
+    try {
+        // Request microphone permission
+        stream = await navigator.mediaDevices.getUserMedia({
+            audio: {
+                sampleRate: 16000, // Azure Speech SDK works well with 16kHz
+                channelCount: 1, // Mono
+                echoCancellation: true,
+                noiseSuppression: true
+            }
+        });
+
+        // Set up MediaRecorder - try WAV first, fallback to WebM
+        let options = {};
+        let fileExtension = 'webm';
+
+        // Try WAV format first (best for Azure Speech SDK, no conversion needed)
+        if (MediaRecorder.isTypeSupported('audio/wav')) {
+            options.mimeType = 'audio/wav';
+            fileExtension = 'wav';
+        }
+        // Try WebM with Opus codec
+        else if (MediaRecorder.isTypeSupported('audio/webm;codecs=opus')) {
+            options.mimeType = 'audio/webm;codecs=opus';
+            fileExtension = 'webm';
+        }
+        // Fallback to default WebM
+        else if (MediaRecorder.isTypeSupported('audio/webm')) {
+            options.mimeType = 'audio/webm';
+            fileExtension = 'webm';
+        }
+
+        console.log('Using audio format:', options.mimeType || 'default');
+
+        mediaRecorder = new MediaRecorder(stream, options);
+
+        // Store the file extension for later use
+        mediaRecorder.fileExtension = fileExtension;
+        audioChunks = [];
+        isCanceling = false; // Reset cancel flag when starting new recording
+
+        mediaRecorder.addEventListener('dataavailable', (event) => {
+            if (event.data.size > 0) {
+                console.log('[Recording] Audio chunk received, size:', event.data.size);
+                audioChunks.push(event.data);
+            }
+        });
+
+        mediaRecorder.addEventListener('stop', handleRecordingStop);
+
+        // Start recording - request data every second for better chunk collection
+        mediaRecorder.start(1000); // Timeslice: 1000ms
+        recordingStartTime = Date.now();
+        remainingTime = MAX_RECORDING_DURATION;
+
+        console.log('[Recording] Started with 1-second timeslice for better chunk collection');
+
+        // Reset waveform data
+        waveformData = [];
+
+        // Show recording UI
+        showRecordingUI();
+
+        // Start waveform visualization
+        startWaveformVisualization(stream);
+
+        // Start countdown timer
+        startCountdown();
+
+        // Update permission state to granted
+        microphonePermissionState = 'granted';
+        sessionPermissionRequested = true;
+        updateMicrophoneIconState('granted');
+
+        // Save state if preference is 'remember'
+        if (userMicrophonePreference === 'remember') {
+            await savePermissionState('granted');
+        }
+
+    } catch (error) {
+        console.error('Error starting recording:', error);
+
+        if (error.name === 'NotAllowedError' || error.name === 'PermissionDeniedError') {
+            microphonePermissionState = 'denied';
+            sessionPermissionRequested = true;
+            updateMicrophoneIconState('denied');
+
+            // Save state if preference is 'remember'
+            if (userMicrophonePreference === 'remember') {
+                await savePermissionState('denied');
+            }
+
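+            // Note: only the 'remember' preference persists the permission state via
+            // saveUserSetting(); 'ask-every-session' and 'ask-every-page-load' keep it in
+            // module-level variables for the current page only.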
showToast('Microphone permission denied. Click the microphone icon to manage permissions.', 'warning'); + } else { + showToast('Error starting recording: ' + error.message, 'danger'); + } + } +} + +/** + * Stop recording and send for transcription + */ +function stopAndSendRecording() { + if (mediaRecorder && mediaRecorder.state === 'recording') { + const recordingDuration = (Date.now() - recordingStartTime) / 1000; + console.log('[Recording] Stopping recording after', recordingDuration.toFixed(2), 'seconds'); + console.log('[Recording] Total chunks collected so far:', audioChunks.length); + + mediaRecorder.stop(); + + // Stop all tracks + if (stream) { + stream.getTracks().forEach(track => track.stop()); + } + } +} + +/** +/** + * Cancel recording + */ +function cancelRecording() { + // Set cancel flag BEFORE stopping the recorder + isCanceling = true; + + if (mediaRecorder && mediaRecorder.state === 'recording') { + mediaRecorder.stop(); + + // Stop all tracks + if (stream) { + stream.getTracks().forEach(track => track.stop()); + } + } + + // Clear waveform data + waveformData = []; + + // Clear audio chunks + audioChunks = []; + + // Reset UI + hideRecordingUI(); + stopWaveformVisualization(); + stopCountdown(); +} + +/** + * Convert audio blob to WAV format using Web Audio API + * @param {Blob} audioBlob - The audio blob to convert + * @returns {Promise} WAV formatted audio blob + */ +async function convertToWav(audioBlob) { + console.log('Converting audio to WAV format...'); + + // Create audio context + const audioContext = new (window.AudioContext || window.webkitAudioContext)({ + sampleRate: 16000 // 16kHz for Azure Speech SDK + }); + + // Convert blob to array buffer + const arrayBuffer = await audioBlob.arrayBuffer(); + + // Decode audio data + const audioBuffer = await audioContext.decodeAudioData(arrayBuffer); + + console.log('Audio decoded:', { + sampleRate: audioBuffer.sampleRate, + duration: audioBuffer.duration, + channels: audioBuffer.numberOfChannels + }); + + // Get audio data (convert to mono if needed) + let audioData; + if (audioBuffer.numberOfChannels > 1) { + // Mix down to mono + const left = audioBuffer.getChannelData(0); + const right = audioBuffer.getChannelData(1); + audioData = new Float32Array(left.length); + for (let i = 0; i < left.length; i++) { + audioData[i] = (left[i] + right[i]) / 2; + } + } else { + audioData = audioBuffer.getChannelData(0); + } + + // Convert float32 to int16 (WAV PCM format) + const int16Data = new Int16Array(audioData.length); + for (let i = 0; i < audioData.length; i++) { + const s = Math.max(-1, Math.min(1, audioData[i])); + int16Data[i] = s < 0 ? 
s * 0x8000 : s * 0x7FFF; + } + + // Create WAV file + const wavBlob = createWavBlob(int16Data, audioBuffer.sampleRate); + + console.log('WAV conversion complete:', { + originalSize: audioBlob.size, + wavSize: wavBlob.size, + sampleRate: audioBuffer.sampleRate + }); + + // Close audio context + await audioContext.close(); + + return wavBlob; +} + +/** + * Create a WAV blob from PCM data + * @param {Int16Array} samples - PCM audio samples + * @param {number} sampleRate - Sample rate in Hz + * @returns {Blob} WAV formatted blob + */ +function createWavBlob(samples, sampleRate) { + const buffer = new ArrayBuffer(44 + samples.length * 2); + const view = new DataView(buffer); + + // Write WAV header + const writeString = (offset, string) => { + for (let i = 0; i < string.length; i++) { + view.setUint8(offset + i, string.charCodeAt(i)); + } + }; + + writeString(0, 'RIFF'); + view.setUint32(4, 36 + samples.length * 2, true); + writeString(8, 'WAVE'); + writeString(12, 'fmt '); + view.setUint32(16, 16, true); // fmt chunk size + view.setUint16(20, 1, true); // PCM format + view.setUint16(22, 1, true); // Mono channel + view.setUint32(24, sampleRate, true); + view.setUint32(28, sampleRate * 2, true); // byte rate + view.setUint16(32, 2, true); // block align + view.setUint16(34, 16, true); // bits per sample + writeString(36, 'data'); + view.setUint32(40, samples.length * 2, true); + + // Write PCM data + const offset = 44; + for (let i = 0; i < samples.length; i++) { + view.setInt16(offset + i * 2, samples[i], true); + } + + return new Blob([buffer], { type: 'audio/wav' }); +} + +/** + * Handle recording stop event + */ +async function handleRecordingStop() { + if (isCanceling) { + console.log('Recording canceled by user'); + hideRecordingUI(); + isCanceling = false; // Reset flag + return; + } + + // Check if recording was canceled + stopWaveformVisualization(); + stopCountdown(); + + // Check if recording was canceled (no chunks) + if (audioChunks.length === 0) { + hideRecordingUI(); + return; + } + + // Get the MIME type from the MediaRecorder + const mimeType = mediaRecorder && mediaRecorder.mimeType ? 
mediaRecorder.mimeType : 'audio/webm'; + + // Create blob from chunks with correct MIME type + const originalBlob = new Blob(audioChunks, { type: mimeType }); + + console.log('Original audio blob created:', { type: mimeType, size: originalBlob.size }); + + // Show processing state + const sendBtn = document.getElementById('send-recording-btn'); + const cancelBtn = document.getElementById('cancel-recording-btn'); + + if (sendBtn) { + sendBtn.disabled = true; + sendBtn.innerHTML = ''; + } + + if (cancelBtn) { + cancelBtn.disabled = true; + } + + try { + // Convert to WAV format for Azure Speech SDK compatibility + const wavBlob = await convertToWav(originalBlob); + + console.log('[Recording] WAV conversion complete, sending to backend'); + + // Update button text - keep same spinner + if (sendBtn) { + sendBtn.innerHTML = ''; + } + + // Send to backend for transcription + const formData = new FormData(); + formData.append('audio', wavBlob, 'recording.wav'); + + console.log('[Recording] Sending WAV audio to backend, size:', wavBlob.size); + + const response = await fetch('/api/speech/transcribe-chat', { + method: 'POST', + body: formData + }); + + const result = await response.json(); + + if (result.success && result.text) { + // Append transcribed text to existing input + const userInput = document.getElementById('user-input'); + if (userInput) { + console.log('[Speech Input] Transcription successful:', result.text); + + // Check if there's existing text + const existingText = userInput.value.trim(); + + if (existingText) { + // Append with newline separator + userInput.value = existingText + '\n' + result.text; + } else { + // No existing text, just set the transcription + userInput.value = result.text; + } + + console.log('[Speech Input] User input updated, value length:', userInput.value.length); + + // Adjust textarea height + userInput.style.height = ''; + userInput.style.height = Math.min(userInput.scrollHeight, 200) + 'px'; + + // Trigger input change to show send button + if (window.handleInputChange) { + window.handleInputChange(); + } + } + + showToast('Voice message transcribed successfully', 'success'); + + console.log('[Speech Input] Starting auto-send countdown...'); + // Start auto-send countdown + startAutoSendCountdown(); + } else { + showToast(result.error || 'Failed to transcribe audio', 'danger'); + } + + } catch (error) { + console.error('Error transcribing audio:', error); + showToast('Error transcribing audio: ' + error.message, 'danger'); + } finally { + // Reset UI + hideRecordingUI(); + + if (sendBtn) { + sendBtn.disabled = false; + sendBtn.innerHTML = ''; + } + + if (cancelBtn) { + cancelBtn.disabled = false; + } + } +} + +/** + * Show recording UI and hide normal input + */ +function showRecordingUI() { + const normalContainer = document.getElementById('normal-input-container'); + const recordingContainer = document.getElementById('recording-container'); + + if (normalContainer) { + normalContainer.style.display = 'none'; + } + + if (recordingContainer) { + recordingContainer.style.display = 'block'; + } +} + +/** + * Hide recording UI and show normal input + */ +function hideRecordingUI() { + const normalContainer = document.getElementById('normal-input-container'); + const recordingContainer = document.getElementById('recording-container'); + + if (normalContainer) { + normalContainer.style.display = 'block'; + } + + if (recordingContainer) { + recordingContainer.style.display = 'none'; + } +} + +/** + * Start waveform visualization + */ +function 
startWaveformVisualization(audioStream) { + const canvas = document.getElementById('waveform-canvas'); + if (!canvas) return; + + const canvasCtx = canvas.getContext('2d'); + + // Set canvas size - height is now 36px to match buttons + canvas.width = canvas.offsetWidth; + canvas.height = 36; + + // Create audio context and analyser + const AudioContext = window.AudioContext || window.webkitAudioContext; + audioContext = new AudioContext(); + analyser = audioContext.createAnalyser(); + analyser.fftSize = 256; + + const source = audioContext.createMediaStreamSource(audioStream); + source.connect(analyser); + + const bufferLength = analyser.frequencyBinCount; + const dataArray = new Uint8Array(bufferLength); + + // Draw function + function draw() { + animationFrame = requestAnimationFrame(draw); + + analyser.getByteFrequencyData(dataArray); + + // Calculate average amplitude for this frame + let sum = 0; + for (let i = 0; i < bufferLength; i++) { + sum += dataArray[i]; + } + const avgAmplitude = sum / bufferLength / 255; // Normalize to 0-1 + + // Store amplitude for this frame (keep as 0-1, we'll handle centering in drawing) + waveformData.push(avgAmplitude); + + // Calculate progress (how much of the recording time has elapsed) + const elapsed = Date.now() - recordingStartTime; + const elapsedSeconds = elapsed / 1000; + + // Check if we've hit the time limit FIRST (before clamping progress) + if (elapsedSeconds >= MAX_RECORDING_DURATION) { + console.log('[Recording] Time limit reached at', elapsedSeconds.toFixed(2), 'seconds, auto-stopping...'); + stopAndSendRecording(); + return; // Stop the animation loop + } + + const progress = Math.min(elapsed / (MAX_RECORDING_DURATION * 1000), 1); + const progressWidth = canvas.width * progress; + + // Check if dark mode is active + const isDarkMode = document.documentElement.getAttribute('data-bs-theme') === 'dark'; + + // Clear canvas with appropriate background color + canvasCtx.fillStyle = isDarkMode ? '#343a40' : '#f8f9fa'; + canvasCtx.fillRect(0, 0, canvas.width, canvas.height); + + // Draw unfilled area (dashed line at center) + canvasCtx.setLineDash([5, 5]); + canvasCtx.strokeStyle = isDarkMode ? 
'#495057' : '#dee2e6'; + canvasCtx.lineWidth = 1; + canvasCtx.beginPath(); + canvasCtx.moveTo(progressWidth, canvas.height / 2); + canvasCtx.lineTo(canvas.width, canvas.height / 2); + canvasCtx.stroke(); + canvasCtx.setLineDash([]); + + // Draw recorded waveform (filled area) - vertical bars + if (waveformData.length > 1) { + const centerY = canvas.height / 2; + const maxBarHeight = canvas.height * 1.95; // Bars can extend 48% of canvas height in each direction (96% total) + const barSpacing = 3; // Pixels between bars + const pointsToShow = Math.floor(progressWidth / barSpacing); + const step = waveformData.length / pointsToShow; + + // Determine waveform color based on progress + let waveformColor = '#0d6efd'; // Default blue + if (progress >= 0.95) { + waveformColor = '#dc3545'; // Red + } else if (progress >= 0.85) { + waveformColor = '#ffc107'; // Yellow + } + + canvasCtx.lineWidth = 2; + canvasCtx.strokeStyle = waveformColor; + + for (let i = 0; i < pointsToShow && i < waveformData.length; i++) { + const dataIndex = Math.floor(i * step); + const amplitude = waveformData[dataIndex]; + const x = i * barSpacing; + + // Draw vertical bar from center, extending both up and down + const barHeight = amplitude * maxBarHeight; + + canvasCtx.beginPath(); + canvasCtx.moveTo(x, centerY - barHeight); + canvasCtx.lineTo(x, centerY + barHeight); + canvasCtx.stroke(); + } + } + } + + draw(); +} + +/** + * Stop waveform visualization + */ +function stopWaveformVisualization() { + if (animationFrame) { + cancelAnimationFrame(animationFrame); + animationFrame = null; + } + + if (audioContext) { + audioContext.close(); + audioContext = null; + } + + analyser = null; +} + +/** + * Start countdown timer (progress bar) + */ +function startCountdown() { + const timerBar = document.getElementById('recording-timer-bar'); + if (!timerBar) return; + + const startTime = Date.now(); + const duration = MAX_RECORDING_DURATION * 1000; // Convert to milliseconds + + const updateProgress = () => { + const elapsed = Date.now() - startTime; + const remaining = duration - elapsed; + + if (remaining <= 0) { + // Time's up - auto stop recording + remainingTime = 0; + stopAndSendRecording(); + } else { + // Calculate percentage remaining based on actual elapsed time + const percentRemaining = (remaining / duration) * 100; + remainingTime = Math.ceil(remaining / 1000); + + // Update bar width using CSS variable + document.documentElement.style.setProperty('--recording-timer-width', percentRemaining + '%'); + + // Change color classes when time is running out + timerBar.classList.remove('warning', 'danger'); + if (percentRemaining <= 10) { + timerBar.classList.add('danger'); + } else if (percentRemaining <= 30) { + timerBar.classList.add('warning'); + } + + // Continue animation + countdownInterval = requestAnimationFrame(updateProgress); + } + }; + + // Start the animation loop + countdownInterval = requestAnimationFrame(updateProgress); +} + +/** + * Stop countdown timer + */ +function stopCountdown() { + if (countdownInterval) { + cancelAnimationFrame(countdownInterval); + countdownInterval = null; + } + + const timerBar = document.getElementById('recording-timer-bar'); + if (timerBar) { + document.documentElement.style.setProperty('--recording-timer-width', '100%'); + timerBar.classList.remove('warning', 'danger'); + } + + remainingTime = MAX_RECORDING_DURATION; +} + +/** + * Start auto-send countdown after transcription + */ +function startAutoSendCountdown() { + console.log('[Auto-Send] Starting countdown...'); + + 
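+    // Auto-send design: a progress fill is animated behind the send button label with
+    // requestAnimationFrame over the countdown window; a capture-phase click listener
+    // lets the user cancel before it completes, and on completion the button is restored
+    // and clicked programmatically so the normal send handlers run.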
const totalCountdown = 5; // seconds + let countdown = totalCountdown; + const sendBtn = document.getElementById('send-btn'); + + if (!sendBtn) { + console.error('[Auto-Send] Send button not found!'); + return; + } + + console.log('[Auto-Send] Send button found, current conversation ID:', window.currentConversationId || 'NEW'); + + // Store original button state + const originalHTML = sendBtn.innerHTML; + const originalDisabled = sendBtn.disabled; + + // Add a progress background element + const progressBg = document.createElement('div'); + progressBg.style.cssText = ` + position: absolute; + top: 0; + left: 0; + height: 100%; + width: 0%; + background: linear-gradient(90deg, #0d6efd, #0dcaf0); + border-radius: 0.375rem; + transition: width 0.1s linear; + z-index: -1; + `; + sendBtn.style.position = 'relative'; + sendBtn.style.overflow = 'hidden'; + sendBtn.appendChild(progressBg); + + // Update button appearance for countdown mode + sendBtn.style.color = 'white'; + sendBtn.classList.add('btn-primary'); + sendBtn.classList.remove('btn-warning'); + + // Click handler to cancel auto-send + const cancelAutoSend = (event) => { + // Prevent default action and stop event propagation + event.preventDefault(); + event.stopPropagation(); + event.stopImmediatePropagation(); + + console.log('[Auto-Send] Cancelled by user'); + clearAutoSend(); + + // Remove progress background + if (progressBg.parentNode) { + progressBg.remove(); + } + + sendBtn.innerHTML = originalHTML; + sendBtn.disabled = originalDisabled; + sendBtn.style.color = ''; + sendBtn.classList.remove('btn-warning'); + sendBtn.classList.add('btn-primary'); + sendBtn.removeEventListener('click', cancelAutoSend, true); + showToast('Auto-send cancelled. Click Send when ready.', 'info'); + }; + + // Add event listener with capture phase to intercept before other handlers + sendBtn.addEventListener('click', cancelAutoSend, true); + + // Animation frame for smooth progress + const startTime = Date.now(); + const duration = totalCountdown * 1000; // milliseconds + + const updateProgress = () => { + const elapsed = Date.now() - startTime; + const progress = Math.min(elapsed / duration, 1); + const percentage = progress * 100; + + // Update progress background width + progressBg.style.width = percentage + '%'; + + if (progress < 1) { + autoSendCountdown = requestAnimationFrame(updateProgress); + } else { + // Countdown complete - send immediately + console.log('[Auto-Send] ===== COUNTDOWN COMPLETE ====='); + console.log('[Auto-Send] Current conversation ID:', window.currentConversationId || 'NEW'); + console.log('[Auto-Send] User input value:', document.getElementById('user-input')?.value); + console.log('[Auto-Send] Chatbox children count:', document.getElementById('chatbox')?.children.length); + + // Remove progress background + if (progressBg.parentNode) { + progressBg.remove(); + } + + // Restore button to original state + sendBtn.innerHTML = originalHTML; + sendBtn.disabled = originalDisabled; + sendBtn.style.color = ''; + sendBtn.classList.remove('btn-warning', 'auto-sending'); + sendBtn.classList.add('btn-primary'); + + // Remove the cancel listener + sendBtn.removeEventListener('click', cancelAutoSend, true); + + // Clear the auto-send state + autoSendCountdown = null; + autoSendTimeout = null; + + console.log('[Auto-Send] About to trigger click...'); + // Trigger the send by programmatically clicking the button + // This ensures all normal send handlers fire + requestAnimationFrame(() => { + console.log('[Auto-Send] Clicking send button 
NOW'); + sendBtn.click(); + console.log('[Auto-Send] Click triggered, conversation ID after:', window.currentConversationId || 'NEW'); + }); + } + }; + + // Start the animation + autoSendCountdown = requestAnimationFrame(updateProgress); + + // Also store timeout reference for cleanup + autoSendTimeout = autoSendCountdown; +} + +/** + * Clear auto-send countdown + */ +function clearAutoSend() { + if (autoSendCountdown) { + cancelAnimationFrame(autoSendCountdown); + autoSendCountdown = null; + } + + if (autoSendTimeout) { + clearTimeout(autoSendTimeout); + autoSendTimeout = null; + } +} + diff --git a/application/single_app/static/js/chat/chat-streaming.js b/application/single_app/static/js/chat/chat-streaming.js index 4092bb84..1519890a 100644 --- a/application/single_app/static/js/chat/chat-streaming.js +++ b/application/single_app/static/js/chat/chat-streaming.js @@ -61,13 +61,28 @@ function updateStreamingButtonState() { const streamingToggleBtn = document.getElementById('streaming-toggle-btn'); if (!streamingToggleBtn) return; - if (streamingEnabled) { - streamingToggleBtn.classList.remove('btn-outline-secondary'); + // Check if TTS autoplay is enabled + let ttsAutoplayEnabled = false; + if (typeof window.appSettings !== 'undefined' && window.appSettings.enable_text_to_speech) { + const cachedSettings = JSON.parse(localStorage.getItem('userSettings') || '{}'); + ttsAutoplayEnabled = cachedSettings.settings?.ttsAutoplay === true; + } + + if (ttsAutoplayEnabled) { + // Disable streaming button when TTS autoplay is on + streamingToggleBtn.classList.remove('btn-primary'); + streamingToggleBtn.classList.add('btn-outline-secondary', 'disabled'); + streamingToggleBtn.disabled = true; + streamingToggleBtn.title = 'Streaming disabled - TTS autoplay is enabled. 
Disable TTS autoplay in your profile to enable streaming.'; + } else if (streamingEnabled) { + streamingToggleBtn.classList.remove('btn-outline-secondary', 'disabled'); streamingToggleBtn.classList.add('btn-primary'); + streamingToggleBtn.disabled = false; streamingToggleBtn.title = 'Streaming enabled - click to disable'; } else { - streamingToggleBtn.classList.remove('btn-primary'); + streamingToggleBtn.classList.remove('btn-primary', 'disabled'); streamingToggleBtn.classList.add('btn-outline-secondary'); + streamingToggleBtn.disabled = false; streamingToggleBtn.title = 'Streaming disabled - click to enable'; } } @@ -86,6 +101,24 @@ function updateStreamingButtonVisibility() { } export function isStreamingEnabled() { + // Check if TTS autoplay is enabled - streaming is incompatible with TTS autoplay + if (typeof window.appSettings !== 'undefined' && window.appSettings.enable_text_to_speech) { + // Dynamically check TTS settings + loadUserSettings().then(settings => { + if (settings.ttsAutoplay === true) { + console.log('TTS autoplay enabled - streaming disabled'); + } + }).catch(error => { + console.error('Error checking TTS settings:', error); + }); + + // Synchronous check using cached value if available + const cachedSettings = JSON.parse(localStorage.getItem('userSettings') || '{}'); + if (cachedSettings.settings?.ttsAutoplay === true) { + return false; // Disable streaming when TTS autoplay is active + } + } + // Check if image generation is active - streaming is incompatible with image gen const imageGenBtn = document.getElementById('image-generate-btn'); if (imageGenBtn && imageGenBtn.classList.contains('active')) { @@ -307,7 +340,8 @@ function finalizeStreamingMessage(messageId, userMessageId, finalData) { finalData.agent_citations || [], finalData.agent_display_name || null, finalData.agent_name || null, - null + null, + true // isNewMessage - trigger autoplay for new streaming responses ); // Update conversation if needed @@ -324,8 +358,6 @@ function finalizeStreamingMessage(messageId, userMessageId, finalData) { // Update sidebar conversation title in real-time updateSidebarConversationTitle(finalData.conversation_id, finalData.conversation_title); } - - showToast('Response complete', 'success'); } export function cancelStreaming() { diff --git a/application/single_app/static/js/chat/chat-tts.js b/application/single_app/static/js/chat/chat-tts.js new file mode 100644 index 00000000..23a48c70 --- /dev/null +++ b/application/single_app/static/js/chat/chat-tts.js @@ -0,0 +1,1056 @@ +// chat-tts.js - Text-to-Speech functionality for chat messages + +import { showToast } from './chat-toast.js'; + +// TTS State Management +let ttsEnabled = false; +let ttsAutoplay = false; +let ttsVoice = 'en-US-Andrew:DragonHDLatestNeural'; +let ttsSpeed = 1.0; +let currentPlayingAudio = null; +let currentPlayingMessageId = null; +let audioQueue = []; // Queue for chunked audio playback +let isQueueing = false; // Track if we're still loading chunks +let wordHighlightInterval = null; // Track word highlighting interval +let currentWordIndex = 0; // Current word being highlighted +let totalWords = 0; // Total words in current chunk +let wordOffset = 0; // Starting word index for current chunk +let highlightState = null; // Store state for pause/resume: { messageId, chunkText, duration, startWordIndex, msPerWord } + +// Audio visualization +let audioContext = null; +let analyser = null; +let volumeCheckInterval = null; +let currentAudioSource = null; + +/** + * Initialize TTS settings from user 
preferences + */ +export async function initializeTTS() { + try { + const response = await fetch('/api/user/settings'); + if (!response.ok) { + throw new Error('Failed to load user settings'); + } + + const data = await response.json(); + const settings = data.settings || {}; + + ttsEnabled = settings.ttsEnabled || false; + ttsAutoplay = settings.ttsAutoplay || false; + ttsVoice = settings.ttsVoice || 'en-US-Andrew:DragonHDLatestNeural'; + ttsSpeed = settings.ttsSpeed || 1.0; + + console.log('TTS initialized:', { ttsEnabled, ttsAutoplay, ttsVoice, ttsSpeed }); + + // Update button state after loading settings + updateAutoplayButton(); + + } catch (error) { + console.error('Error initializing TTS:', error); + } +} + +/** + * Check if TTS is enabled + */ +export function isTTSEnabled() { + return ttsEnabled; +} + +/** + * Check if TTS autoplay is enabled + */ +export function isTTSAutoplayEnabled() { + return ttsAutoplay; +} + +/** + * Play text-to-speech for a message with chunked delivery for faster start + */ +export async function playTTS(messageId, text) { + // Stop any currently playing audio + stopTTS(); + + if (!text || text.trim() === '') { + showToast('No text to read', 'warning'); + return; + } + + try { + // Update button to show loading state + updateTTSButton(messageId, 'loading'); + + // Strip HTML tags and get plain text + const tempDiv = document.createElement('div'); + tempDiv.innerHTML = text; + const plainText = tempDiv.textContent || tempDiv.innerText || ''; + + // Split text into word-based chunks + // Group 1: Progressive chunks (10, 15, 20, 25, 30 words) + // Group 2+: Remaining in 40-word chunks + const words = plainText.split(/\s+/); + const chunks = []; + + let index = 0; + + // Group 1: Progressive chunks + if (words.length > index) { + chunks.push(words.slice(index, index + 10).join(' ')); + index += 10; + } + if (words.length > index) { + chunks.push(words.slice(index, index + 15).join(' ')); + index += 15; + } + if (words.length > index) { + chunks.push(words.slice(index, index + 20).join(' ')); + index += 20; + } + if (words.length > index) { + chunks.push(words.slice(index, index + 25).join(' ')); + index += 25; + } + if (words.length > index) { + chunks.push(words.slice(index, index + 30).join(' ')); + index += 30; + } + + // Group 2+: Remaining words in 40-word chunks + while (index < words.length) { + chunks.push(words.slice(index, index + 40).join(' ')); + index += 40; + } + + console.log(`[TTS] Split into ${chunks.length} chunks:`, chunks.map(c => `${c.split(/\s+/).length} words`)); + + // Synthesize chunks 1 and 2 in parallel + const firstChunk = chunks.shift(); + const secondChunk = chunks.length > 0 ? 
chunks.shift() : null; + + console.log('[TTS] Synthesizing chunks 1 and 2 in parallel...'); + const parallelPromises = [synthesizeChunk(firstChunk, messageId)]; + if (secondChunk) { + parallelPromises.push(synthesizeChunk(secondChunk, messageId)); + } + + const [firstAudio, secondAudio] = await Promise.all(parallelPromises); + if (!firstAudio) return; + + // Track word offsets for each chunk + let currentWordOffset = 0; + const firstChunkWordCount = firstChunk.trim().split(/\s+/).length; + + // Queue chunk 2 immediately (it's already synthesized) + if (secondChunk && secondAudio) { + const secondChunkWordCount = secondChunk.trim().split(/\s+/).length; + audioQueue.push({ + audio: secondAudio, + url: secondAudio.src, + text: secondChunk, + wordOffset: firstChunkWordCount // Start after first chunk's words + }); + console.log('[TTS] Chunk 2 pre-queued, ready to play after chunk 1'); + } + + // Start playing first chunk + console.log('[TTS] Playing chunk 1 immediately'); + currentPlayingAudio = firstAudio; + currentPlayingMessageId = messageId; + + // Setup audio event handlers + currentPlayingAudio.onloadedmetadata = () => { + // Audio metadata loaded, duration is now available + const duration = currentPlayingAudio.duration; + startWordHighlighting(messageId, firstChunk, duration, 0); // Start at word 0 + }; + + // If metadata is already loaded, start highlighting immediately + if (currentPlayingAudio.duration && !isNaN(currentPlayingAudio.duration)) { + const duration = currentPlayingAudio.duration; + startWordHighlighting(messageId, firstChunk, duration, 0); + } + + currentPlayingAudio.onpause = () => { + updateTTSButton(messageId, 'paused'); + console.log('[TTS] Audio paused event fired'); + pauseWordHighlighting(); + }; + + currentPlayingAudio.onplay = () => { + console.log('[TTS] Audio play event fired, highlightState exists:', !!highlightState, 'interval is null:', wordHighlightInterval === null); + updateTTSButton(messageId, 'playing'); + highlightPlayingMessage(messageId, true); + // Resume word highlighting if we were paused (highlightState exists but no active interval) + if (highlightState && wordHighlightInterval === null) { + console.log('[TTS] Resuming from pause'); + resumeWordHighlighting(); + } + }; + + currentPlayingAudio.onended = () => { + // Play next chunk from queue if available + playNextChunk(messageId); + }; + + currentPlayingAudio.onerror = (error) => { + console.error('Audio playback error:', error); + showToast('Error playing audio', 'danger'); + updateTTSButton(messageId, 'stopped'); + highlightPlayingMessage(messageId, false); + currentPlayingAudio = null; + currentPlayingMessageId = null; + audioQueue = []; + }; + + // Start playback of first chunk + await currentPlayingAudio.play(); + + // Synthesize remaining chunks in groups while audio is playing + if (chunks.length > 0) { + isQueueing = true; + // Calculate starting word offset for remaining chunks (after chunks 1 and 2) + const firstChunkWords = firstChunk.trim().split(/\s+/).length; + const secondChunkWords = secondChunk ? 
secondChunk.trim().split(/\s+/).length : 0; + const startingOffset = firstChunkWords + secondChunkWords; + + queueChunksInGroups(chunks, messageId, startingOffset).then(() => { + isQueueing = false; + console.log(`[TTS] All chunks queued successfully`); + }).catch(error => { + console.error('[TTS] Error queueing chunks:', error); + isQueueing = false; + }); + } else { + console.log('[TTS] No remaining chunks - single chunk playback'); + } + + } catch (error) { + console.error('Error playing TTS:', error); + showToast(`TTS Error: ${error.message}`, 'danger'); + updateTTSButton(messageId, 'stopped'); + currentPlayingAudio = null; + currentPlayingMessageId = null; + audioQueue = []; + isQueueing = false; + } +} + +/** + * Synthesize a text chunk and return Audio element + */ +async function synthesizeChunk(text, messageId) { + try { + const response = await fetch('/api/chat/tts', { + method: 'POST', + headers: { + 'Content-Type': 'application/json' + }, + body: JSON.stringify({ + text: text, + voice: ttsVoice, + speed: ttsSpeed + }) + }); + + if (!response.ok) { + const errorData = await response.json(); + throw new Error(errorData.error || 'Failed to generate speech'); + } + + // Get audio blob + const audioBlob = await response.blob(); + const audioUrl = URL.createObjectURL(audioBlob); + + return new Audio(audioUrl); + + } catch (error) { + console.error('Error synthesizing chunk:', error); + throw error; + } +} + +/** + * Queue chunks in groups with parallel synthesis: + * - Group 1: Chunks 3-7 all in parallel (5 chunks) + * - Group 2+: Remaining chunks in batches of 5, all parallel within each batch + */ +async function queueChunksInGroups(chunks, messageId, startingWordOffset = 0) { + console.log(`[TTS] Queueing ${chunks.length} remaining chunks in groups of 5 (parallel within each group)`); + + try { + let groupNum = 1; + let chunkNumOffset = 3; // Start at chunk 3 since chunks 1 and 2 are already handled + let currentWordOffset = startingWordOffset; + + while (chunks.length > 0) { + // Take up to 5 chunks for this group + const groupSize = Math.min(5, chunks.length); + const groupChunks = chunks.splice(0, groupSize); + + console.log(`[TTS] Group ${groupNum}: Synthesizing ${groupSize} chunks in parallel`); + + // Synthesize all chunks in this group in parallel + const synthesisPromises = groupChunks.map((text, index) => { + const chunkNum = chunkNumOffset + index; + const wordCount = text.split(/\s+/).length; + const thisChunkOffset = currentWordOffset; + + // Increment offset for next chunk + currentWordOffset += wordCount; + + console.log(`[TTS] Starting synthesis for chunk ${chunkNum} (${wordCount} words, offset: ${thisChunkOffset})`); + return synthesizeChunk(text, messageId).then(audio => ({ + chunkNum: chunkNum, + audio: audio, + url: audio ? 
audio.src : null, + text: text, + wordOffset: thisChunkOffset + })); + }); + + // Wait for all chunks in this group to complete + const results = await Promise.all(synthesisPromises); + + // Add to queue in order + results.forEach(result => { + if (result.audio) { + audioQueue.push({ + audio: result.audio, + url: result.url, + text: result.text, + wordOffset: result.wordOffset + }); + console.log(`[TTS] Chunk ${result.chunkNum} queued (${result.text.split(/\s+/).length} words, offset: ${result.wordOffset}), queue size: ${audioQueue.length}`); + } + }); + + console.log(`[TTS] Group ${groupNum} complete, ${chunks.length} chunks remaining`); + chunkNumOffset += groupSize; + groupNum++; + } + + console.log(`[TTS] All ${groupNum - 1} groups complete, total queue size: ${audioQueue.length}`); + + } catch (error) { + console.error('[TTS] Error in group queueing:', error); + throw error; + } +} + +/** + * Queue multiple text chunks for background synthesis (in parallel) + */ +async function queueMultipleChunks(chunks, messageId) { + console.log(`[TTS] Queueing ${chunks.length} chunks in parallel`); + + try { + // Start all syntheses in parallel + const synthesisPromises = chunks.map((text, index) => { + console.log(`[TTS] Starting synthesis for chunk ${index + 1}/${chunks.length}: ${text.split(/\s+/).length} words`); + return synthesizeChunk(text, messageId).then(audio => ({ + index: index, + audio: audio, + url: audio.src, + text: text + })); + }); + + // Wait for all to complete + const results = await Promise.all(synthesisPromises); + + // Sort by original order (in case they complete out of order) + results.sort((a, b) => a.index - b.index); + + // Add to queue in correct order + results.forEach((result, i) => { + audioQueue.push({ + audio: result.audio, + url: result.url, + text: result.text + }); + console.log(`[TTS] Queued chunk ${i + 1}: ${result.text.split(/\s+/).length} words, queue size: ${audioQueue.length}`); + }); + + console.log(`[TTS] All ${chunks.length} chunks synthesized and queued in parallel, final queue size: ${audioQueue.length}`); + + } catch (error) { + console.error('[TTS] Error during parallel synthesis:', error); + // Even if some fail, queue whatever succeeded + } +} + +/** + * Play next chunk from queue + */ +function playNextChunk(messageId) { + console.log(`[TTS] playNextChunk called - queue: ${audioQueue.length}, isQueueing: ${isQueueing}`); + + if (audioQueue.length === 0) { + // Check if we're still loading chunks + if (isQueueing) { + console.log('[TTS] Queue empty but still loading chunks, waiting...'); + // Wait a bit and try again + setTimeout(() => playNextChunk(messageId), 100); + return; + } + + // No more chunks, end playback + console.log('[TTS] Playback complete'); + updateTTSButton(messageId, 'stopped'); + highlightPlayingMessage(messageId, false); + currentPlayingAudio = null; + currentPlayingMessageId = null; + return; + } + + // Get next chunk + const nextChunk = audioQueue.shift(); + console.log(`[TTS] Playing next chunk, ${audioQueue.length} remaining in queue`); + + // Cleanup previous audio URL + if (currentPlayingAudio && currentPlayingAudio.src) { + URL.revokeObjectURL(currentPlayingAudio.src); + } + + currentPlayingAudio = nextChunk.audio; + + // Setup handlers for next chunk + currentPlayingAudio.onloadedmetadata = () => { + // Start word highlighting for this chunk when metadata is loaded + const duration = currentPlayingAudio.duration; + const chunkText = nextChunk.text || ''; + const wordOffset = nextChunk.wordOffset || 0; + 
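+        // wordOffset is the index of this chunk's first word within the whole message,
+        // so word highlighting resumes exactly where the previous chunk left off.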
startWordHighlighting(messageId, chunkText, duration, wordOffset); + }; + + // If metadata is already loaded, start highlighting immediately + if (currentPlayingAudio.duration && !isNaN(currentPlayingAudio.duration)) { + const duration = currentPlayingAudio.duration; + const chunkText = nextChunk.text || ''; + const wordOffset = nextChunk.wordOffset || 0; + startWordHighlighting(messageId, chunkText, duration, wordOffset); + } + + currentPlayingAudio.onpause = () => { + // Audio paused - pause word highlighting + console.log('[TTS] Chunk audio paused event fired'); + pauseWordHighlighting(); + }; + + currentPlayingAudio.onplay = () => { + // Audio playing/resumed - resume word highlighting and restart visualization + console.log('[TTS] Chunk audio play event fired, highlightState exists:', !!highlightState, 'interval is null:', wordHighlightInterval === null); + + // Restart audio visualization for new chunk + startAudioVisualization(messageId); + + if (highlightState && wordHighlightInterval === null) { + console.log('[TTS] Resuming from pause in chunk'); + resumeWordHighlighting(); + } + }; + + currentPlayingAudio.onended = () => { + URL.revokeObjectURL(nextChunk.url); + playNextChunk(messageId); + }; + + currentPlayingAudio.onerror = (error) => { + console.error('Error playing queued chunk:', error); + URL.revokeObjectURL(nextChunk.url); + playNextChunk(messageId); // Try next chunk + }; + + // Play next chunk + currentPlayingAudio.play().catch(error => { + console.error('Error starting next chunk:', error); + playNextChunk(messageId); // Try next chunk + }); +} + + +/** + * Stop currently playing TTS + */ +export function stopTTS() { + if (currentPlayingAudio) { + currentPlayingAudio.pause(); + currentPlayingAudio = null; + + if (currentPlayingMessageId) { + updateTTSButton(currentPlayingMessageId, 'stopped'); + highlightPlayingMessage(currentPlayingMessageId, false); + currentPlayingMessageId = null; + } + } + + // Clear audio queue and revoke URLs + audioQueue.forEach(chunk => { + if (chunk.url) { + URL.revokeObjectURL(chunk.url); + } + }); + audioQueue = []; + isQueueing = false; +} + +/** + * Pause currently playing TTS + */ +export function pauseTTS() { + if (currentPlayingAudio && !currentPlayingAudio.paused) { + currentPlayingAudio.pause(); + if (currentPlayingMessageId) { + updateTTSButton(currentPlayingMessageId, 'paused'); + } + } +} + +/** + * Resume paused TTS + */ +export function resumeTTS() { + if (currentPlayingAudio && currentPlayingAudio.paused) { + currentPlayingAudio.play(); + if (currentPlayingMessageId) { + updateTTSButton(currentPlayingMessageId, 'playing'); + } + } +} + +/** + * Update TTS button state + */ +function updateTTSButton(messageId, state) { + const button = document.querySelector(`[data-message-id="${messageId}"] .tts-play-btn`); + if (!button) { + console.log('[TTS] Button not found for message:', messageId); + return; + } + + const icon = button.querySelector('i'); + if (!icon) { + console.log('[TTS] Icon not found in button for message:', messageId); + return; + } + + // Remove all state classes + icon.classList.remove('bi-volume-up', 'bi-pause-fill', 'bi-stop-fill'); + button.classList.remove('btn-primary', 'btn-success', 'btn-warning'); + button.disabled = false; + + switch (state) { + case 'loading': + icon.className = 'bi bi-hourglass-split'; + button.disabled = true; + button.title = 'One moment, I’m taking a look'; + break; + + case 'playing': + icon.className = 'bi bi-pause-fill'; + button.classList.add('btn-success'); + button.title = 
'Hold on, pause what you are reading'; + break; + + case 'paused': + icon.className = 'bi bi-volume-up'; + button.classList.add('btn-warning'); + button.title = 'Go ahead, continue reading'; + break; + + case 'stopped': + default: + icon.className = 'bi bi-volume-up'; + button.title = 'Read this to me'; + break; + } +} + +/** + * Highlight message being read + */ +/** + * Prepare message text for word-by-word highlighting + */ +function prepareMessageForHighlighting(messageId) { + const messageElement = document.querySelector(`[data-message-id="${messageId}"]`); + if (!messageElement) return; + + const messageTextDiv = messageElement.querySelector('.message-text'); + if (!messageTextDiv || messageTextDiv.dataset.ttsWrapped === 'true') return; + + // Function to wrap words in text nodes only, not HTML + function wrapWordsInTextNodes(node) { + if (node.nodeType === Node.TEXT_NODE) { + // This is a text node - wrap its words + const text = node.textContent; + if (text.trim().length === 0) return; // Skip whitespace-only nodes + + const words = text.split(/(\s+)/); // Split but keep whitespace + const fragment = document.createDocumentFragment(); + + words.forEach(word => { + if (/\S/.test(word)) { + // Non-whitespace word - wrap it + const span = document.createElement('span'); + span.className = 'tts-word'; + span.textContent = word; + fragment.appendChild(span); + } else { + // Whitespace - keep as text + fragment.appendChild(document.createTextNode(word)); + } + }); + + node.parentNode.replaceChild(fragment, node); + } else if (node.nodeType === Node.ELEMENT_NODE) { + // This is an element - recurse into its children + // Convert to array to avoid live NodeList issues + Array.from(node.childNodes).forEach(child => wrapWordsInTextNodes(child)); + } + } + + wrapWordsInTextNodes(messageTextDiv); + messageTextDiv.dataset.ttsWrapped = 'true'; +} + +/** + * Start highlighting words progressively during playback + */ +function startWordHighlighting(messageId, chunkText, duration, startWordIndex = 0) { + // Clear any existing highlighting + stopWordHighlighting(); + + // Validate duration + if (!duration || duration === 0 || isNaN(duration)) { + console.log('[TTS] Invalid duration for word highlighting, skipping'); + return; + } + + // Prepare message for highlighting if not already done + prepareMessageForHighlighting(messageId); + + const messageElement = document.querySelector(`[data-message-id="${messageId}"]`); + if (!messageElement) return; + + const allWordElements = messageElement.querySelectorAll('.tts-word'); + if (allWordElements.length === 0) return; + + // Count words in this chunk + const chunkWords = chunkText.trim().split(/\s+/).length; + + // Calculate which words to highlight for this chunk + wordOffset = startWordIndex; + totalWords = Math.min(chunkWords, allWordElements.length - wordOffset); + currentWordIndex = 0; + + if (totalWords <= 0) { + console.log('[TTS] No words to highlight for this chunk'); + return; + } + + // Calculate time per word (in milliseconds) + const msPerWord = (duration * 1000) / totalWords; + + // Store state for pause/resume + highlightState = { + messageId: messageId, + chunkText: chunkText, + duration: duration, + startWordIndex: startWordIndex, + msPerWord: msPerWord, + allWordElements: allWordElements + }; + + console.log(`[TTS] Word highlighting: chunk has ${chunkWords} words, highlighting words ${wordOffset} to ${wordOffset + totalWords - 1}, ${duration.toFixed(2)}s duration, ${msPerWord.toFixed(0)}ms per word`); + + // Highlight first word 
immediately + const firstWordIndex = wordOffset; + if (firstWordIndex < allWordElements.length) { + allWordElements[firstWordIndex].classList.add('tts-current-word'); + } + + // Set interval to highlight next words + wordHighlightInterval = setInterval(() => { + // Check if audio is paused - if so, stop highlighting + if (currentPlayingAudio && currentPlayingAudio.paused) { + console.log('[TTS] Audio paused, stopping word highlight interval'); + pauseWordHighlighting(); + return; + } + + // Remove highlight from previous word + const prevIndex = wordOffset + currentWordIndex; + if (prevIndex < allWordElements.length) { + allWordElements[prevIndex].classList.remove('tts-current-word'); + } + + currentWordIndex++; + + // Add highlight to current word + const nextIndex = wordOffset + currentWordIndex; + if (currentWordIndex < totalWords && nextIndex < allWordElements.length) { + allWordElements[nextIndex].classList.add('tts-current-word'); + } else { + // Reached the end of this chunk, clear interval + stopWordHighlighting(); + } + }, msPerWord); +} + +/** + * Pause word highlighting (keep state for resume) + */ +function pauseWordHighlighting() { + console.log('[TTS] Pausing word highlighting, currentWordIndex:', currentWordIndex); + if (wordHighlightInterval) { + clearInterval(wordHighlightInterval); + wordHighlightInterval = null; + } + // Keep currentWordIndex, totalWords, wordOffset, and highlightState for resume +} + +/** + * Resume word highlighting from current audio position + */ +function resumeWordHighlighting() { + if (!highlightState || !currentPlayingAudio) return; + + const { messageId, msPerWord, allWordElements } = highlightState; + + // Calculate current word position based on audio time + const elapsedTime = currentPlayingAudio.currentTime * 1000; // Convert to ms + const calculatedWordIndex = Math.floor(elapsedTime / msPerWord); + + // Update currentWordIndex to match audio position + currentWordIndex = Math.min(calculatedWordIndex, totalWords - 1); + + console.log(`[TTS] Resuming word highlighting from word ${currentWordIndex} (audio time: ${currentPlayingAudio.currentTime.toFixed(2)}s)`); + + // Highlight current word + const currentIndex = wordOffset + currentWordIndex; + if (currentIndex < allWordElements.length) { + allWordElements[currentIndex].classList.add('tts-current-word'); + } + + // Continue highlighting from this point + wordHighlightInterval = setInterval(() => { + // Check if audio is paused - if so, stop highlighting + if (currentPlayingAudio && currentPlayingAudio.paused) { + console.log('[TTS] Audio paused during resume, stopping word highlight interval'); + pauseWordHighlighting(); + return; + } + + // Remove highlight from previous word + const prevIndex = wordOffset + currentWordIndex; + if (prevIndex < allWordElements.length) { + allWordElements[prevIndex].classList.remove('tts-current-word'); + } + + currentWordIndex++; + + // Add highlight to current word + const nextIndex = wordOffset + currentWordIndex; + if (currentWordIndex < totalWords && nextIndex < allWordElements.length) { + allWordElements[nextIndex].classList.add('tts-current-word'); + } else { + // Reached the end of this chunk, clear interval + stopWordHighlighting(); + } + }, msPerWord); +} + +/** + * Stop word highlighting + */ +function stopWordHighlighting() { + if (wordHighlightInterval) { + clearInterval(wordHighlightInterval); + wordHighlightInterval = null; + } + + // Remove all word highlights + if (currentPlayingMessageId) { + const messageElement = 
document.querySelector(`[data-message-id="${currentPlayingMessageId}"]`); + if (messageElement) { + const wordElements = messageElement.querySelectorAll('.tts-word'); + wordElements.forEach(word => word.classList.remove('tts-current-word')); + } + } + + currentWordIndex = 0; + totalWords = 0; + highlightState = null; +} + +/** + * Start audio visualization for avatar pulsing based on volume + */ +function startAudioVisualization(messageId) { + if (!currentPlayingAudio) return; + + try { + // Create AudioContext if not exists + if (!audioContext) { + audioContext = new (window.AudioContext || window.webkitAudioContext)(); + } + + // Create analyzer if not exists + if (!analyser) { + analyser = audioContext.createAnalyser(); + analyser.fftSize = 256; + } + + // Only create a new source if we don't have one or audio element changed + if (!currentAudioSource || currentAudioSource.mediaElement !== currentPlayingAudio) { + // Disconnect old source if exists + if (currentAudioSource) { + try { + currentAudioSource.disconnect(); + } catch (e) { + // Ignore disconnect errors + } + } + + // Create new source and connect + currentAudioSource = audioContext.createMediaElementSource(currentPlayingAudio); + currentAudioSource.connect(analyser); + analyser.connect(audioContext.destination); + } + + const dataArray = new Uint8Array(analyser.frequencyBinCount); + const avatar = document.querySelector(`[data-message-id="${messageId}"] .avatar`); + + if (!avatar) return; + + // Clear any existing interval + if (volumeCheckInterval) { + clearInterval(volumeCheckInterval); + } + + // Update avatar glow based on volume + volumeCheckInterval = setInterval(() => { + if (!currentPlayingAudio || currentPlayingAudio.paused || currentPlayingAudio.ended) { + return; // Don't stop completely, just pause updates + } + + analyser.getByteFrequencyData(dataArray); + + // Calculate average volume + const sum = dataArray.reduce((a, b) => a + b, 0); + const average = sum / dataArray.length; + + // Remove all volume classes + avatar.classList.remove('volume-low', 'volume-medium', 'volume-high', 'volume-peak'); + + // Add appropriate class based on volume level + if (average < 30) { + avatar.classList.add('volume-low'); + } else if (average < 60) { + avatar.classList.add('volume-medium'); + } else if (average < 90) { + avatar.classList.add('volume-high'); + } else { + avatar.classList.add('volume-peak'); + } + }, 50); // Update every 50ms for smooth visualization + + } catch (error) { + console.error('[TTS] Error setting up audio visualization:', error); + } +} + +/** + * Stop audio visualization + */ +function stopAudioVisualization(messageId) { + if (volumeCheckInterval) { + clearInterval(volumeCheckInterval); + volumeCheckInterval = null; + } + + // Remove volume classes from avatar + if (messageId) { + const avatar = document.querySelector(`[data-message-id="${messageId}"] .avatar`); + if (avatar) { + avatar.classList.remove('volume-low', 'volume-medium', 'volume-high', 'volume-peak'); + } + } +} + +function highlightPlayingMessage(messageId, highlight) { + const messageElement = document.querySelector(`[data-message-id="${messageId}"]`); + if (!messageElement) return; + + if (highlight) { + messageElement.classList.add('tts-playing'); + startAudioVisualization(messageId); + } else { + messageElement.classList.remove('tts-playing'); + stopAudioVisualization(messageId); + stopWordHighlighting(); + } +} + +/** + * Handle TTS button click + */ +export function handleTTSButtonClick(messageId, text) { + // If text is not 
provided, extract it from the DOM + if (!text || text.trim() === '') { + const messageElement = document.querySelector(`[data-message-id="${messageId}"]`); + if (messageElement) { + const messageTextDiv = messageElement.querySelector('.message-text'); + if (messageTextDiv) { + text = messageTextDiv.innerText || messageTextDiv.textContent; + } + } + + // If still no text, show error + if (!text || text.trim() === '') { + showToast('No text to read', 'warning'); + return; + } + } + + // If this message is currently playing, pause it + if (currentPlayingMessageId === messageId && currentPlayingAudio) { + if (currentPlayingAudio.paused) { + resumeTTS(); + } else { + pauseTTS(); + } + } else { + // Play this message + playTTS(messageId, text); + } +} + +/** + * Create TTS button HTML + */ +export function createTTSButton(messageId) { + return ` + + `; +} + +/** + * Auto-play TTS for new AI messages if enabled + */ +export function autoplayTTSIfEnabled(messageId, text) { + console.log('[TTS Autoplay] Check:', { ttsEnabled, ttsAutoplay, messageId, hasText: !!text }); + if (ttsEnabled && ttsAutoplay) { + console.log('[TTS Autoplay] Playing message:', messageId); + + // Wait for button to be rendered before playing + const waitForButton = (attempts = 0) => { + const button = document.querySelector(`[data-message-id="${messageId}"] .tts-play-btn`); + + if (button) { + console.log('[TTS Autoplay] Button found, starting playback'); + playTTS(messageId, text); + } else if (attempts < 10) { + // Retry up to 10 times (1 second total) + console.log(`[TTS Autoplay] Button not found, retry ${attempts + 1}/10`); + setTimeout(() => waitForButton(attempts + 1), 100); + } else { + console.warn('[TTS Autoplay] Button not found after 10 attempts, skipping autoplay'); + } + }; + + // Start checking for button after small delay + setTimeout(() => waitForButton(), 100); + } +} + +/** + * Toggle TTS autoplay on/off + */ +export async function toggleTTSAutoplay() { + ttsAutoplay = !ttsAutoplay; + + console.log('[TTS Autoplay] Toggled to:', ttsAutoplay); + + // Save to user settings + try { + const response = await fetch('/api/user/settings', { + method: 'POST', + headers: { + 'Content-Type': 'application/json' + }, + body: JSON.stringify({ + settings: { + ttsAutoplay: ttsAutoplay + } + }) + }); + + if (!response.ok) { + throw new Error('Failed to save autoplay setting'); + } + + // Update button UI + updateAutoplayButton(); + + // Show toast notification + const message = ttsAutoplay ? 
'AI Voice enabled' : 'AI Voice disabled'; + showToast(message, 'success'); + + } catch (error) { + console.error('Error saving AI Voice setting:', error); + showToast('Failed to save AI Voice setting', 'danger'); + // Revert the toggle + ttsAutoplay = !ttsAutoplay; + } +} + +/** + * Update the autoplay button UI based on current state + */ +export function updateAutoplayButton() { + const button = document.getElementById('tts-autoplay-toggle-btn'); + if (!button) return; + + const icon = button.querySelector('i'); + if (ttsAutoplay) { + icon.className = 'bi bi-volume-up-fill'; + button.title = 'Auto voice response enabled - click to disable'; + button.classList.remove('btn-outline-secondary'); + button.classList.add('btn-primary'); + } else { + icon.className = 'bi bi-volume-mute'; + button.title = 'Auto voice response disabled - click to enable'; + button.classList.remove('btn-primary'); + button.classList.add('btn-outline-secondary'); + } +} + +/** + * Initialize the autoplay button state and event listener + */ +export function initializeAutoplayButton() { + const button = document.getElementById('tts-autoplay-toggle-btn'); + if (!button) return; + + button.addEventListener('click', toggleTTSAutoplay); + updateAutoplayButton(); +} + +// Export functions for global access +window.chatTTS = { + handleButtonClick: handleTTSButtonClick, + stop: stopTTS, + pause: pauseTTS, + resume: resumeTTS, + toggleAutoplay: toggleTTSAutoplay +}; + +// Initialize autoplay button when module loads +initializeAutoplayButton(); diff --git a/application/single_app/static/js/control-center.js b/application/single_app/static/js/control-center.js index 0f64dd37..e804865b 100644 --- a/application/single_app/static/js/control-center.js +++ b/application/single_app/static/js/control-center.js @@ -2,6 +2,141 @@ // Control Center JavaScript functionality // Handles user management, pagination, modals, and API interactions +import { showToast } from "./chat/chat-toast.js"; + +// Group Table Sorter - similar to user table but for groups +class GroupTableSorter { + constructor(tableId) { + this.table = document.getElementById(tableId); + this.currentSort = { column: null, direction: 'asc' }; + this.initializeSorting(); + } + + initializeSorting() { + if (!this.table) return; + + const headers = this.table.querySelectorAll('th.sortable'); + headers.forEach(header => { + header.addEventListener('click', () => { + const sortKey = header.getAttribute('data-sort'); + this.sortTable(sortKey, header); + }); + }); + } + + sortTable(sortKey, headerElement) { + const tbody = this.table.querySelector('tbody'); + const rows = Array.from(tbody.querySelectorAll('tr')).filter(row => + !row.querySelector('td[colspan]') // Exclude loading/empty rows + ); + + // Toggle sort direction + if (this.currentSort.column === sortKey) { + this.currentSort.direction = this.currentSort.direction === 'asc' ? 'desc' : 'asc'; + } else { + this.currentSort.direction = 'asc'; + } + this.currentSort.column = sortKey; + + // Remove sorting classes from all headers + this.table.querySelectorAll('th.sortable').forEach(th => { + th.classList.remove('sort-asc', 'sort-desc'); + }); + + // Add sorting class to current header + headerElement.classList.add(this.currentSort.direction === 'asc' ? 
'sort-asc' : 'sort-desc'); + + // Sort rows + const sortedRows = rows.sort((a, b) => { + let aValue = this.getCellValue(a, sortKey); + let bValue = this.getCellValue(b, sortKey); + + // Handle different data types + if (sortKey === 'members' || sortKey === 'documents') { + // Numeric sorting for numbers and dates + aValue = this.parseNumericValue(aValue); + bValue = this.parseNumericValue(bValue); + + if (this.currentSort.direction === 'asc') { + return aValue - bValue; + } else { + return bValue - aValue; + } + } else { + // String sorting for text values + const result = aValue.localeCompare(bValue, undefined, { numeric: true, sensitivity: 'base' }); + return this.currentSort.direction === 'asc' ? result : -result; + } + }); + + // Clear tbody and append sorted rows + tbody.innerHTML = ''; + sortedRows.forEach(row => tbody.appendChild(row)); + } + + getCellValue(row, sortKey) { + const cellIndex = this.getColumnIndex(sortKey); + if (cellIndex === -1) return ''; + + const cell = row.cells[cellIndex]; + if (!cell) return ''; + + // Extract text content, handling different cell structures + let value = ''; + + switch (sortKey) { + case 'name': + // Extract group name + const nameElement = cell.querySelector('.fw-bold') || cell; + value = nameElement.textContent.trim(); + break; + case 'owner': + // Extract owner name + value = cell.textContent.trim(); + break; + case 'members': + // Extract member count + const memberText = cell.textContent.trim(); + const memberMatch = memberText.match(/(\d+)/); + value = memberMatch ? memberMatch[1] : '0'; + break; + case 'status': + // Extract status from badge + const statusBadge = cell.querySelector('.group-status-badge, .badge'); + value = statusBadge ? statusBadge.textContent.trim() : cell.textContent.trim(); + break; + case 'documents': + // Extract document count + const docText = cell.textContent.trim(); + const docMatch = docText.match(/(\d+)/); + value = docMatch ? docMatch[1] : '0'; + break; + default: + value = cell.textContent.trim(); + } + + return value; + } + + getColumnIndex(sortKey) { + const headers = this.table.querySelectorAll('th'); + for (let i = 0; i < headers.length; i++) { + if (headers[i].getAttribute('data-sort') === sortKey) { + return i; + } + } + return -1; + } + + parseNumericValue(value) { + if (!value || value === '' || value.toLowerCase() === 'never') return 0; + + // Extract numeric value from string + const numMatch = value.match(/(\d+)/); + return numMatch ? 
parseInt(numMatch[1]) : 0; + } +} + class ControlCenter { constructor() { this.currentPage = 1; @@ -29,15 +164,24 @@ class ControlCenter { init() { this.bindEvents(); - this.loadUsers(); - this.loadActivityTrends(); - // Also load groups and public workspaces on initial page load - // This ensures they get their cached metrics on first load - setTimeout(() => { - this.loadGroups(); - this.loadPublicWorkspaces(); - }, 500); // Small delay to ensure DOM is ready + // Check if user has admin role (passed from backend) + const hasAdminRole = window.hasControlCenterAdmin === true; + + // Only load admin features if user has ControlCenterAdmin role + if (hasAdminRole) { + this.loadUsers(); + + // Also load groups and public workspaces on initial page load + // This ensures they get their cached metrics on first load + setTimeout(() => { + this.loadGroups(); + this.loadPublicWorkspaces(); + }, 500); // Small delay to ensure DOM is ready + } + + // Always load activity trends (available to all Control Center users) + this.loadActivityTrends(); } bindEvents() { @@ -122,6 +266,10 @@ class ControlCenter { // User management modal document.getElementById('saveUserChangesBtn')?.addEventListener('click', () => this.saveUserChanges()); + document.getElementById('deleteUserDocumentsBtn')?.addEventListener('click', + () => this.deleteUserDocuments()); + document.getElementById('confirmDeleteUserDocumentsBtn')?.addEventListener('click', + () => this.confirmDeleteUserDocuments()); // Modal controls document.getElementById('accessStatusSelect')?.addEventListener('change', @@ -339,18 +487,9 @@ class ControlCenter { } renderChatMetrics(chatMetrics) { - if (!chatMetrics) { - return '
No data
Use Refresh Data button
'; - } - - const totalConversations = chatMetrics.total_conversations || 0; - const totalMessages = chatMetrics.total_messages || 0; - const messageSize = chatMetrics.total_message_size || 0; - - // If all values are zero/empty, show refresh message - if (totalConversations === 0 && totalMessages === 0 && messageSize === 0) { - return '
No cached data
Use Refresh Data button
'; - } + const totalConversations = chatMetrics?.total_conversations || 0; + const totalMessages = chatMetrics?.total_messages || 0; + const messageSize = chatMetrics?.total_message_size || 0; return `
@@ -362,21 +501,12 @@ class ControlCenter { } renderDocumentMetrics(docMetrics) { - if (!docMetrics) { - return '
No data
Use Refresh Data button
'; - } - - const totalDocs = docMetrics.total_documents || 0; - const aiSearchSize = docMetrics.ai_search_size || 0; - const storageSize = docMetrics.storage_account_size || 0; + const totalDocs = docMetrics?.total_documents || 0; + const aiSearchSize = docMetrics?.ai_search_size || 0; + const storageSize = docMetrics?.storage_account_size || 0; // Always get enhanced citation setting from app settings, not user data const enhancedCitation = (typeof appSettings !== 'undefined' && appSettings.enable_enhanced_citations) || false; - const personalWorkspace = docMetrics.personal_workspace_enabled; - - // If all values are zero/empty, show refresh message - if (totalDocs === 0 && aiSearchSize === 0 && storageSize === 0) { - return '
No cached data
Use Refresh Data button
'; - } + const personalWorkspace = docMetrics?.personal_workspace_enabled; let html = `
@@ -401,21 +531,12 @@ class ControlCenter { } renderGroupDocumentMetrics(docMetrics) { - if (!docMetrics) { - return '
No data
Use Refresh Data button
'; - } - - const totalDocs = docMetrics.total_documents || 0; - const aiSearchSize = docMetrics.ai_search_size || 0; - const storageSize = docMetrics.storage_account_size || 0; + const totalDocs = docMetrics?.total_documents || 0; + const aiSearchSize = docMetrics?.ai_search_size || 0; + const storageSize = docMetrics?.storage_account_size || 0; // Always get enhanced citation setting from app settings, not user data const enhancedCitation = (typeof appSettings !== 'undefined' && appSettings.enable_enhanced_citations) || false; - // If all values are zero/empty, show refresh message - if (totalDocs === 0 && aiSearchSize === 0 && storageSize === 0) { - return '
No cached data
Use Refresh Data button
'; - } - let html = `
Total Docs: ${totalDocs}
@@ -432,19 +553,10 @@ class ControlCenter { } renderLoginActivity(loginMetrics) { - if (!loginMetrics) { - return '
No login data
Use Refresh Data button
'; - } + const totalLogins = loginMetrics?.total_logins || 0; + const lastLogin = loginMetrics?.last_login; - const totalLogins = loginMetrics.total_logins || 0; - const lastLogin = loginMetrics.last_login; - - // If no logins recorded and no last login, show refresh message - if (totalLogins === 0 && !lastLogin) { - return '
No cached data
Use Refresh Data button
'; - } - - let lastLoginFormatted = 'Never'; + let lastLoginFormatted = 'None'; if (lastLogin) { try { const date = new Date(lastLogin); @@ -455,7 +567,7 @@ class ControlCenter { year: 'numeric' }); } catch { - lastLoginFormatted = 'Invalid date'; + lastLoginFormatted = 'None'; } } @@ -641,13 +753,23 @@ class ControlCenter { // Extract user info from table row const nameCell = cells[1]; const userName = nameCell.querySelector('.fw-semibold')?.textContent || 'Unknown User'; - const userEmail = nameCell.querySelector('.text-muted')?.textContent || ''; + const userEmail = nameCell.querySelectorAll('.text-muted')[0]?.textContent || ''; + + // Extract document count from cell 6 (Document Metrics column) + const docMetricsCell = cells[6]; + const totalDocsText = docMetricsCell?.querySelector('div > div:first-child')?.textContent || ''; + const docCount = totalDocsText.match(/Total Docs:\s*(\d+)/)?.[1] || '0'; + + // Extract last login from cell 4 (Login Activity column) + const loginActivityCell = cells[4]; + const lastLoginText = loginActivityCell?.querySelector('div > div:first-child')?.textContent || ''; + const lastLogin = lastLoginText.replace('Last Login:', '').trim() || 'None'; // Populate modal document.getElementById('modalUserName').textContent = userName; document.getElementById('modalUserEmail').textContent = userEmail; - document.getElementById('modalUserDocuments').textContent = cells[4]?.textContent.split('\n')[0] || '0 docs'; - document.getElementById('modalUserLastActivity').textContent = cells[4]?.textContent.split('\n')[1]?.replace('Last: ', '') || 'Unknown'; + document.getElementById('modalUserDocuments').textContent = `${docCount} docs`; + document.getElementById('modalUserLastActivity').textContent = lastLogin; // Set current user this.currentUser = { id: userId, name: userName, email: userEmail }; @@ -685,6 +807,66 @@ class ControlCenter { } } + deleteUserDocuments() { + if (!this.currentUser) { + this.showError('No user selected'); + return; + } + + // Clear previous reason and show confirmation modal + document.getElementById('deleteUserDocumentsReason').value = ''; + const deleteModal = new bootstrap.Modal(document.getElementById('deleteUserDocumentsModal')); + deleteModal.show(); + } + + async confirmDeleteUserDocuments() { + if (!this.currentUser) { + this.showError('No user selected'); + return; + } + + const reason = document.getElementById('deleteUserDocumentsReason').value.trim(); + const confirmBtn = document.getElementById('confirmDeleteUserDocumentsBtn'); + + if (!reason) { + this.showError('Please provide a reason for deleting this user\'s documents'); + return; + } + + // Disable button during request + confirmBtn.disabled = true; + confirmBtn.innerHTML = 'Submitting...'; + + try { + const response = await fetch(`/api/admin/control-center/users/${this.currentUser.id}/delete-documents`, { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ reason }) + }); + + const data = await response.json(); + + if (!response.ok) { + throw new Error(data.error || 'Failed to create document deletion request'); + } + + // Close both modals + bootstrap.Modal.getInstance(document.getElementById('deleteUserDocumentsModal')).hide(); + bootstrap.Modal.getInstance(document.getElementById('userManagementModal')).hide(); + + this.showSuccess('Document deletion request created successfully. 
It requires approval from another admin.'); + + // Refresh user list + this.loadUsers(); + + } catch (error) { + this.showError(error.message); + } finally { + confirmBtn.disabled = false; + confirmBtn.innerHTML = 'Submit Request'; + } + } + async saveUserChanges() { if (!this.currentUser) return; @@ -1147,7 +1329,9 @@ class ControlCenter { // Activity Trends Methods async loadActivityTrends() { try { - console.log('🔍 [Frontend Debug] Loading activity trends for', this.currentTrendDays, 'days'); + if (appSettings?.enable_debug_logging) { + console.log('🔍 [Frontend Debug] Loading activity trends for', this.currentTrendDays, 'days'); + } // Build API URL with custom date range if specified let apiUrl = `/api/admin/control-center/activity-trends?days=${this.currentTrendDays}`; @@ -1156,13 +1340,19 @@ class ControlCenter { } const response = await fetch(apiUrl); - console.log('🔍 [Frontend Debug] API response status:', response.status); + if (appSettings?.enable_debug_logging) { + console.log('🔍 [Frontend Debug] API response status:', response.status); + } const data = await response.json(); - console.log('🔍 [Frontend Debug] API response data:', data); + if (appSettings?.enable_debug_logging) { + console.log('🔍 [Frontend Debug] API response data:', data); + } if (response.ok) { - console.log('🔍 [Frontend Debug] Activity data received:', data.activity_data); + if (appSettings?.enable_debug_logging) { + console.log('🔍 [Frontend Debug] Activity data received:', data.activity_data); + } // Render all four charts this.renderLoginsChart(data.activity_data); this.renderChatsChart(data.activity_data); @@ -1185,7 +1375,9 @@ class ControlCenter { } renderLoginsChart(activityData) { - console.log('🔍 [Frontend Debug] Rendering logins chart with data:', activityData.logins); + if (appSettings?.enable_debug_logging) { + console.log('🔍 [Frontend Debug] Rendering logins chart with data:', activityData.logins); + } this.renderSingleChart('loginsChart', 'logins', activityData.logins, { label: 'Logins', backgroundColor: 'rgba(255, 193, 7, 0.2)', @@ -1194,25 +1386,125 @@ class ControlCenter { } renderChatsChart(activityData) { - console.log('🔍 [Frontend Debug] Rendering chats chart with data:', activityData.chats); - this.renderSingleChart('chatsChart', 'chats', activityData.chats, { - label: 'Chats', - backgroundColor: 'rgba(13, 110, 253, 0.2)', - borderColor: '#0d6efd' + if (appSettings?.enable_debug_logging) { + console.log('🔍 [Frontend Debug] Rendering chats chart with data:', activityData); + } + + // Check if Chart.js is available + if (typeof Chart === 'undefined') { + console.error(`❌ [Frontend Debug] Chart.js is not loaded. 
Cannot render chats chart.`); + this.showChartError('chatsChart', 'chats'); + return; + } + + const canvas = document.getElementById('chatsChart'); + if (!canvas) { + console.error(`❌ [Frontend Debug] Chart canvas element chatsChart not found`); + return; + } + + const ctx = canvas.getContext('2d'); + if (!ctx) { + console.error(`❌ [Frontend Debug] Could not get 2D context from chatsChart canvas`); + return; + } + + // Show canvas + canvas.style.display = 'block'; + + // Destroy existing chart if it exists + if (this.chatsChart) { + this.chatsChart.destroy(); + } + + // Get data for created and deleted chats + const createdData = activityData.chats_created || {}; + const deletedData = activityData.chats_deleted || {}; + const allDates = [...new Set([...Object.keys(createdData), ...Object.keys(deletedData)])].sort(); + + const labels = allDates.map(date => { + const dateObj = new Date(date); + return dateObj.toLocaleDateString('en-US', { month: 'short', day: 'numeric' }); + }); + + const createdValues = allDates.map(date => createdData[date] || 0); + const deletedValues = allDates.map(date => deletedData[date] || 0); + + const datasets = [ + { + label: 'New Chats', + data: createdValues, + borderColor: '#0d6efd', + backgroundColor: 'rgba(13, 110, 253, 0.1)', + borderWidth: 2, + fill: true, + tension: 0.4, + type: 'line' + }, + { + label: 'Deleted Chats', + data: deletedValues, + backgroundColor: 'rgba(220, 53, 69, 0.7)', + borderColor: '#dc3545', + borderWidth: 1, + type: 'bar' + } + ]; + + this.chatsChart = new Chart(ctx, { + type: 'bar', + data: { + labels: labels, + datasets: datasets + }, + options: { + responsive: true, + maintainAspectRatio: false, + plugins: { + legend: { + display: true, + position: 'top', + labels: { + usePointStyle: true, + padding: 15 + } + } + }, + scales: { + x: { + display: true, + grid: { display: false } + }, + y: { + display: true, + beginAtZero: true, + grid: { color: 'rgba(0, 0, 0, 0.1)' }, + ticks: { precision: 0 } + } + } + } }); } renderDocumentsChart(activityData) { - console.log('🔍 [Frontend Debug] Rendering documents chart with personal, group, and public data'); - console.log('🔍 [Frontend Debug] Personal documents:', activityData.personal_documents); - console.log('🔍 [Frontend Debug] Group documents:', activityData.group_documents); - console.log('🔍 [Frontend Debug] Public documents:', activityData.public_documents); + if (appSettings?.enable_debug_logging) { + console.log('🔍 [Frontend Debug] Rendering documents chart with creation/deletion data'); + console.log('🔍 [Frontend Debug] Personal created:', activityData.personal_documents_created); + console.log('🔍 [Frontend Debug] Personal deleted:', activityData.personal_documents_deleted); + console.log('🔍 [Frontend Debug] Group created:', activityData.group_documents_created); + console.log('🔍 [Frontend Debug] Group deleted:', activityData.group_documents_deleted); + console.log('🔍 [Frontend Debug] Public created:', activityData.public_documents_created); + console.log('🔍 [Frontend Debug] Public deleted:', activityData.public_documents_deleted); + } - // Render combined chart with personal, group, and public documents + // Render combined chart with creations (lines) and deletions (bars) this.renderCombinedDocumentsChart('documentsChart', { - personal: activityData.personal_documents || {}, - group: activityData.group_documents || {}, - public: activityData.public_documents || {} + personal_created: activityData.personal_documents_created || {}, + personal_deleted: 
activityData.personal_documents_deleted || {}, + group_created: activityData.group_documents_created || {}, + group_deleted: activityData.group_documents_deleted || {}, + public_created: activityData.public_documents_created || {}, + public_deleted: activityData.public_documents_deleted || {} }); } @@ -1249,10 +1541,14 @@ class ControlCenter { } // Prepare data for Chart.js - get all unique dates and sort them - const personalDates = Object.keys(documentsData.personal || {}); - const groupDates = Object.keys(documentsData.group || {}); - const publicDates = Object.keys(documentsData.public || {}); - const allDates = [...new Set([...personalDates, ...groupDates, ...publicDates])].sort(); + const allDates = [...new Set([ + ...Object.keys(documentsData.personal_created || {}), + ...Object.keys(documentsData.personal_deleted || {}), + ...Object.keys(documentsData.group_created || {}), + ...Object.keys(documentsData.group_deleted || {}), + ...Object.keys(documentsData.public_created || {}), + ...Object.keys(documentsData.public_deleted || {}) + ])].sort(); console.log(`🔍 [Frontend Debug] Documents date range:`, allDates); @@ -1261,42 +1557,77 @@ class ControlCenter { return dateObj.toLocaleDateString('en-US', { month: 'short', day: 'numeric' }); }); - // Prepare datasets for personal, group, and public documents - const personalData = allDates.map(date => (documentsData.personal || {})[date] || 0); - const groupData = allDates.map(date => (documentsData.group || {})[date] || 0); - const publicData = allDates.map(date => (documentsData.public || {})[date] || 0); - - console.log(`🔍 [Frontend Debug] Personal documents data:`, personalData); - console.log(`🔍 [Frontend Debug] Group documents data:`, groupData); - console.log(`🔍 [Frontend Debug] Public documents data:`, publicData); + // Prepare datasets - lines for creations, bars for deletions + const personalCreated = allDates.map(date => (documentsData.personal_created || {})[date] || 0); + const personalDeleted = allDates.map(date => (documentsData.personal_deleted || {})[date] || 0); + const groupCreated = allDates.map(date => (documentsData.group_created || {})[date] || 0); + const groupDeleted = allDates.map(date => (documentsData.group_deleted || {})[date] || 0); + const publicCreated = allDates.map(date => (documentsData.public_created || {})[date] || 0); + const publicDeleted = allDates.map(date => (documentsData.public_deleted || {})[date] || 0); + + console.log(`🔍 [Frontend Debug] Personal created:`, personalCreated); + console.log(`🔍 [Frontend Debug] Personal deleted:`, personalDeleted); + console.log(`🔍 [Frontend Debug] Group created:`, groupCreated); + console.log(`🔍 [Frontend Debug] Group deleted:`, groupDeleted); + console.log(`🔍 [Frontend Debug] Public created:`, publicCreated); + console.log(`🔍 [Frontend Debug] Public deleted:`, publicDeleted); const datasets = [ + // Lines for new documents { - label: 'Personal', - data: personalData, - backgroundColor: 'rgba(144, 238, 144, 0.4)', // Light green - borderColor: '#90EE90', // Light green + label: 'Personal (New)', + data: personalCreated, + borderColor: '#90EE90', + backgroundColor: 'rgba(144, 238, 144, 0.1)', borderWidth: 2, - fill: false, - tension: 0.1 + fill: true, + tension: 0.4, + type: 'line' }, { - label: 'Group', - data: groupData, - backgroundColor: 'rgba(34, 139, 34, 0.4)', // Medium green (forest green) - borderColor: '#228B22', // Medium green (forest green) + label: 'Group (New)', + data: groupCreated, + borderColor: '#228B22', + backgroundColor: 'rgba(34, 139, 34, 
0.1)', borderWidth: 2, - fill: false, - tension: 0.1 + fill: true, + tension: 0.4, + type: 'line' }, { - label: 'Public', - data: publicData, - backgroundColor: 'rgba(0, 100, 0, 0.4)', // Dark green - borderColor: '#006400', // Dark green + label: 'Public (New)', + data: publicCreated, + borderColor: '#006400', + backgroundColor: 'rgba(0, 100, 0, 0.1)', borderWidth: 2, - fill: false, - tension: 0.1 + fill: true, + tension: 0.4, + type: 'line' + }, + // Bars for deleted documents + { + label: 'Personal (Deleted)', + data: personalDeleted, + backgroundColor: 'rgba(255, 182, 193, 0.7)', + borderColor: '#FFB6C1', + borderWidth: 1, + type: 'bar' + }, + { + label: 'Group (Deleted)', + data: groupDeleted, + backgroundColor: 'rgba(220, 53, 69, 0.7)', + borderColor: '#dc3545', + borderWidth: 1, + type: 'bar' + }, + { + label: 'Public (Deleted)', + data: publicDeleted, + backgroundColor: 'rgba(139, 0, 0, 0.7)', + borderColor: '#8B0000', + borderWidth: 1, + type: 'bar' } ]; @@ -1305,7 +1636,7 @@ class ControlCenter { // Create new chart try { this.documentsChart = new Chart(ctx, { - type: 'line', + type: 'bar', data: { labels: labels, datasets: datasets @@ -1315,11 +1646,12 @@ class ControlCenter { maintainAspectRatio: false, plugins: { legend: { - display: true, // Show legend for multiple datasets + display: true, position: 'top', labels: { usePointStyle: true, - padding: 15 + padding: 10, + font: { size: 10 } } }, tooltip: { @@ -1374,7 +1706,9 @@ class ControlCenter { } renderTokensChart(activityData) { - console.log('🔍 [Frontend Debug] Rendering tokens chart with data:', activityData.tokens); + if (appSettings?.enable_debug_logging) { + console.log('🔍 [Frontend Debug] Rendering tokens chart with data:', activityData.tokens); + } // Render combined chart with embedding and chat tokens this.renderCombinedTokensChart('tokensChart', activityData.tokens || {}); @@ -1405,13 +1739,17 @@ class ControlCenter { // Destroy existing chart if it exists if (this.tokensChart) { - console.log('🔍 [Frontend Debug] Destroying existing tokens chart'); + if (appSettings?.enable_debug_logging) { + console.log('🔍 [Frontend Debug] Destroying existing tokens chart'); + } this.tokensChart.destroy(); } // Prepare data from tokens object (format: { "YYYY-MM-DD": { "embedding": count, "chat": count } }) const allDates = Object.keys(tokensData).sort(); - console.log('🔍 [Frontend Debug] Token dates:', allDates); + if (appSettings?.enable_debug_logging) { + console.log('🔍 [Frontend Debug] Token dates:', allDates); + } // Format labels for display const labels = allDates.map(dateStr => { @@ -1423,8 +1761,10 @@ class ControlCenter { const embeddingTokens = allDates.map(date => tokensData[date]?.embedding || 0); const chatTokens = allDates.map(date => tokensData[date]?.chat || 0); - console.log('🔍 [Frontend Debug] Embedding tokens:', embeddingTokens); - console.log('🔍 [Frontend Debug] Chat tokens:', chatTokens); + if (appSettings?.enable_debug_logging) { + console.log('🔍 [Frontend Debug] Embedding tokens:', embeddingTokens); + console.log('🔍 [Frontend Debug] Chat tokens:', chatTokens); + } // Create datasets const datasets = [ @@ -1716,12 +2056,12 @@ class ControlCenter { const endDate = document.getElementById('endDate').value; if (!startDate || !endDate) { - alert('Please select both start and end dates.'); + showToast('Please select both start and end dates.', 'warning'); return; } if (new Date(startDate) > new Date(endDate)) { - alert('Start date must be before end date.'); + showToast('Start date must be before end date.', 
'warning'); return; } @@ -1768,7 +2108,7 @@ class ControlCenter { if (document.getElementById('exportTokens').checked) selectedCharts.push('tokens'); if (selectedCharts.length === 0) { - alert('Please select at least one chart to export.'); + showToast('Please select at least one chart to export.', 'warning'); return; } @@ -1787,12 +2127,12 @@ class ControlCenter { const endDate = document.getElementById('exportEndDate').value; if (!startDate || !endDate) { - alert('Please select both start and end dates for custom range.'); + showToast('Please select both start and end dates for custom range.', 'warning'); return; } if (new Date(startDate) > new Date(endDate)) { - alert('Start date must be before end date.'); + showToast('Start date must be before end date.', 'warning'); return; } @@ -1900,6 +2240,9 @@ class ControlCenter { } renderActivityLogs(logs, userMap) { + // Store logs for modal access + this.currentActivityLogs = logs; + const tbody = document.getElementById('activityLogsTableBody'); if (!tbody) return; @@ -1915,15 +2258,27 @@ class ControlCenter { } tbody.innerHTML = logs.map(log => { - const user = userMap[log.user_id] || {}; - const userName = user.display_name || user.email || log.user_id; + // Handle user identification - some activities may not have user_id (system activities) + let userName = 'System'; + if (log.user_id) { + const user = userMap[log.user_id] || {}; + userName = user.display_name || user.email || log.user_id || 'Unknown User'; + } else if (log.admin_email) { + userName = log.admin_email; + } else if (log.requester_email) { + userName = log.requester_email; + } else if (log.added_by_email) { + userName = log.added_by_email; + } + const timestamp = new Date(log.timestamp).toLocaleString(); const activityType = this.formatActivityType(log.activity_type); const details = this.formatActivityDetails(log); const workspaceType = log.workspace_type || 'N/A'; + const logIndex = logs.indexOf(log); return ` - + ${timestamp} ${activityType} ${this.escapeHtml(userName)} @@ -1938,12 +2293,20 @@ class ControlCenter { const typeMap = { 'user_login': 'User Login', 'conversation_creation': 'Conversation Created', + 'conversation_deletion': 'Conversation Deleted', + 'conversation_archival': 'Conversation Archived', 'document_creation': 'Document Created', + 'document_deletion': 'Document Deleted', + 'document_metadata_update': 'Document Metadata Updated', 'token_usage': 'Token Usage', - 'conversation_deletion': 'Conversation Deleted', - 'conversation_archival': 'Conversation Archived' + 'group_status_change': 'Group Status Change', + 'group_member_deleted': 'Group Member Deleted', + 'add_member_directly': 'Add Member Directly', + 'admin_take_ownership_approved': 'Admin Take Ownership (Approved)', + 'delete_group_approved': 'Delete Group (Approved)', + 'delete_all_documents_approved': 'Delete All Documents (Approved)' }; - return typeMap[activityType] || activityType; + return typeMap[activityType] || activityType.replace(/_/g, ' ').replace(/\b\w/g, l => l.toUpperCase()); } formatActivityDetails(log) { @@ -1958,26 +2321,119 @@ class ControlCenter { const convId = log.conversation?.conversation_id || 'N/A'; return `Title: ${this.escapeHtml(convTitle)}
ID: ${convId}`; + case 'conversation_deletion': + const delTitle = log.conversation?.title || 'Untitled'; + const delId = log.conversation?.conversation_id || 'N/A'; + return `Deleted: ${this.escapeHtml(delTitle)}
ID: ${delId}`; + + case 'conversation_archival': + const archTitle = log.conversation?.title || 'Untitled'; + const archId = log.conversation?.conversation_id || 'N/A'; + return `Archived: ${this.escapeHtml(archTitle)}
ID: ${archId}`; + case 'document_creation': const fileName = log.document?.file_name || 'Unknown'; const fileType = log.document?.file_type || ''; return `File: ${this.escapeHtml(fileName)}
Type: ${fileType}`; + case 'document_deletion': + const delFileName = log.document?.file_name || 'Unknown'; + const delFileType = log.document?.file_type || ''; + return `Deleted: ${this.escapeHtml(delFileName)}
Type: ${delFileType}`; + + case 'document_metadata_update': + const updatedFileName = log.document?.file_name || 'Unknown'; + const updatedFields = Object.keys(log.updated_fields || {}).join(', ') || 'N/A'; + return `File: ${this.escapeHtml(updatedFileName)}
Updated: ${updatedFields}`; + case 'token_usage': const tokenType = log.token_type || 'unknown'; const totalTokens = log.usage?.total_tokens || 0; const model = log.usage?.model || 'N/A'; return `Type: ${tokenType}
Tokens: ${totalTokens.toLocaleString()}
Model: ${model}`; - case 'conversation_deletion': - const delTitle = log.conversation?.title || 'Untitled'; - const delId = log.conversation?.conversation_id || 'N/A'; - return `Deleted: ${this.escapeHtml(delTitle)}
ID: ${delId}`; + case 'group_status_change': + const groupName = log.group?.group_name || 'Unknown Group'; + const oldStatus = log.status_change?.old_status || 'N/A'; + const newStatus = log.status_change?.new_status || 'N/A'; + return `Group: ${this.escapeHtml(groupName)}
Status: ${oldStatus} → ${newStatus}`; - case 'conversation_archival': - const archTitle = log.conversation?.title || 'Untitled'; - const archId = log.conversation?.conversation_id || 'N/A'; - return `Archived: ${this.escapeHtml(archTitle)}
ID: ${archId}`; + case 'group_member_deleted': + const memberName = log.removed_member?.name || log.removed_member?.email || 'Unknown'; + const memberGroupName = log.group?.group_name || 'Unknown Group'; + return `Removed: ${this.escapeHtml(memberName)}
From: ${this.escapeHtml(memberGroupName)}`; + + case 'add_member_directly': + const addedMemberName = log.member_name || log.member_email || 'Unknown'; + const addedToGroup = log.group_name || 'Unknown Group'; + const memberRole = log.member_role || 'user'; + return `Added: ${this.escapeHtml(addedMemberName)}
To: ${this.escapeHtml(addedToGroup)} (${memberRole})`; + + case 'admin_take_ownership_approved': + const ownershipGroup = log.group_name || 'Unknown Group'; + const oldOwner = log.old_owner_email || 'Unknown'; + const newOwner = log.new_owner_email || 'Unknown'; + const approver = log.approver_email || 'N/A'; + return `Group: ${this.escapeHtml(ownershipGroup)}
Old Owner: ${this.escapeHtml(oldOwner)}
New Owner: ${this.escapeHtml(newOwner)}
Approved by: ${this.escapeHtml(approver)}`; + + case 'delete_group_approved': + const deletedGroup = log.group_name || 'Unknown Group'; + const requester = log.requester_email || 'Unknown'; + const delApprover = log.approver_email || 'N/A'; + return `Group: ${this.escapeHtml(deletedGroup)}
Requested by: ${this.escapeHtml(requester)}
Approved by: ${this.escapeHtml(delApprover)}`; + + case 'delete_all_documents_approved': + const docsGroup = log.group_name || 'Unknown Group'; + const docsDeleted = log.documents_deleted !== undefined ? log.documents_deleted : 'N/A'; + const docsRequester = log.requester_email || 'Unknown'; + const docsApprover = log.approver_email || 'N/A'; + return `Group: ${this.escapeHtml(docsGroup)}
Documents Deleted: ${docsDeleted}
Requested by: ${this.escapeHtml(docsRequester)}
Approved by: ${this.escapeHtml(docsApprover)}`; + + case 'public_workspace_status_change': + const workspaceName = log.public_workspace?.workspace_name || log.workspace_context?.public_workspace_name || log.public_workspace_name || 'Unknown Workspace'; + const wsOldStatus = log.status_change?.old_status || 'N/A'; + const wsNewStatus = log.status_change?.new_status || 'N/A'; + return `Workspace: ${this.escapeHtml(workspaceName)}
Status: ${wsOldStatus} → ${wsNewStatus}`; + + case 'admin_take_workspace_ownership_approved': + const wsOwnershipName = log.workspace_name || log.public_workspace_name || 'Unknown Workspace'; + const wsOldOwner = log.old_owner_email || 'Unknown'; + const wsNewOwner = log.new_owner_email || 'Unknown'; + const wsApprover = log.approver_email || 'N/A'; + return `Workspace: ${this.escapeHtml(wsOwnershipName)}
Old Owner: ${this.escapeHtml(wsOldOwner)}
New Owner: ${this.escapeHtml(wsNewOwner)}
Approved by: ${this.escapeHtml(wsApprover)}`; + + case 'transfer_workspace_ownership_approved': + const wsTransferName = log.workspace_name || log.public_workspace_name || 'Unknown Workspace'; + const wsTransferOldOwner = log.old_owner_email || 'Unknown'; + const wsTransferNewOwner = log.new_owner_email || 'Unknown'; + const wsTransferApprover = log.approver_email || 'N/A'; + return `Workspace: ${this.escapeHtml(wsTransferName)}
Old Owner: ${this.escapeHtml(wsTransferOldOwner)}
New Owner: ${this.escapeHtml(wsTransferNewOwner)}
Approved by: ${this.escapeHtml(wsTransferApprover)}`; + + case 'transfer_ownership_approved': + const transferGroup = log.group_name || 'Unknown Group'; + const transferOldOwner = log.old_owner_email || 'Unknown'; + const transferNewOwner = log.new_owner_email || 'Unknown'; + const transferApprover = log.approver_email || 'N/A'; + return `Group: ${this.escapeHtml(transferGroup)}
Old Owner: ${this.escapeHtml(transferOldOwner)}
New Owner: ${this.escapeHtml(transferNewOwner)}
Approved by: ${this.escapeHtml(transferApprover)}`; + + case 'add_workspace_member_directly': + const wsAddedMemberName = log.member_name || log.member_email || 'Unknown'; + const wsAddedTo = log.workspace_name || log.public_workspace_name || 'Unknown Workspace'; + const wsMemberRole = log.member_role || 'user'; + return `Added: ${this.escapeHtml(wsAddedMemberName)}
To: ${this.escapeHtml(wsAddedTo)} (${wsMemberRole})`; + + case 'delete_workspace_documents_approved': + const wsDocsName = log.workspace_name || log.public_workspace_name || 'Unknown Workspace'; + const wsDocsDeleted = log.documents_deleted !== undefined ? log.documents_deleted : 'N/A'; + const wsDocsRequester = log.requester_email || 'Unknown'; + const wsDocsApprover = log.approver_email || 'N/A'; + return `Workspace: ${this.escapeHtml(wsDocsName)}
Documents Deleted: ${wsDocsDeleted}
Requested by: ${this.escapeHtml(wsDocsRequester)}
Approved by: ${this.escapeHtml(wsDocsApprover)}`; + + case 'delete_workspace_approved': + const deletedWorkspace = log.workspace_name || log.public_workspace_name || 'Unknown Workspace'; + const wsDelRequester = log.requester_email || 'Unknown'; + const wsDelApprover = log.approver_email || 'N/A'; + return `Workspace: ${this.escapeHtml(deletedWorkspace)}
Requested by: ${this.escapeHtml(wsDelRequester)}
Approved by: ${this.escapeHtml(wsDelApprover)}`; default: return 'N/A'; @@ -2137,7 +2593,7 @@ class ControlCenter { } catch (error) { console.error('Error exporting activity logs:', error); - alert('Failed to export activity logs. Please try again.'); + showToast('Failed to export activity logs. Please try again.', 'danger'); } } @@ -2188,7 +2644,57 @@ class ControlCenter { return str.charAt(0).toUpperCase() + str.slice(1); } + showRawLogModal(logIndex) { + if (!this.currentActivityLogs || !this.currentActivityLogs[logIndex]) { + showToast('Log data not available', 'warning'); + return; + } + + const log = this.currentActivityLogs[logIndex]; + const modalBody = document.getElementById('rawLogModalBody'); + const modalTitle = document.getElementById('rawLogModalTitle'); + + if (!modalBody || !modalTitle) { + showToast('Modal elements not found', 'danger'); + return; + } + + // Set title + const activityType = this.formatActivityType(log.activity_type); + const timestamp = new Date(log.timestamp).toLocaleString(); + modalTitle.textContent = `${activityType} - ${timestamp}`; + + // Display JSON with pretty formatting + modalBody.innerHTML = `
${this.escapeHtml(JSON.stringify(log, null, 2))}
`; + + // Show modal + const modal = new bootstrap.Modal(document.getElementById('rawLogModal')); + modal.show(); + } + + copyRawLogToClipboard() { + const rawLogText = document.getElementById('rawLogModalBody')?.textContent; + if (!rawLogText) { + showToast('No log data to copy', 'warning'); + return; + } + + navigator.clipboard.writeText(rawLogText).then(() => { + showToast('Log data copied to clipboard', 'success'); + }).catch(err => { + console.error('Failed to copy:', err); + showToast('Failed to copy to clipboard', 'danger'); + }); + } + escapeHtml(text) { + // Handle undefined, null, or non-string values + if (text === undefined || text === null) { + return ''; + } + // Convert to string if not already + text = String(text); + const map = { '&': '&amp;', '<': '&lt;', @@ -2226,7 +2732,7 @@ class ControlCenter { if (document.getElementById('chatDocuments').checked) selectedCharts.push('documents'); if (selectedCharts.length === 0) { - alert('Please select at least one chart to include in the chat.'); + showToast('Please select at least one chart to include in the chat.', 'warning'); return; } @@ -2245,12 +2751,12 @@ class ControlCenter { const endDate = document.getElementById('chatEndDate').value; if (!startDate || !endDate) { - alert('Please select both start and end dates for custom range.'); + showToast('Please select both start and end dates for custom range.', 'warning'); return; } if (new Date(startDate) > new Date(endDate)) { - alert('Start date must be before end date.'); + showToast('Start date must be before end date.', 'warning'); return; } @@ -2329,7 +2835,9 @@ class ControlCenter { this.groupDocumentsChart.destroy(); this.groupDocumentsChart = null; } - console.log('🔍 [Frontend Debug] All charts destroyed'); + if (appSettings?.enable_debug_logging) { + console.log('🔍 [Frontend Debug] All charts destroyed'); + } } showAllChartsError() { @@ -2341,7 +2849,9 @@ // Ensure main loading overlay is hidden when showing error this.showLoading(false); - console.log('🔍 [Frontend Debug] Main loading overlay hidden after all charts error'); + if (appSettings?.enable_debug_logging) { + console.log('🔍 [Frontend Debug] Main loading overlay hidden after all charts error'); + } } showChartError(canvasId, chartType) { @@ -2498,28 +3008,38 @@ class ControlCenter { createGroupRow(group) { // Format storage size - const storageSize = group.activity?.document_metrics?.storage_account_size || 0; + const storageSize = group.metrics?.document_metrics?.storage_account_size || group.activity?.document_metrics?.storage_account_size || 0; const storageSizeFormatted = storageSize > 0 ? this.formatBytes(storageSize) : '0 B'; // Format AI search size - const aiSearchSize = group.activity?.document_metrics?.ai_search_size || 0; + const aiSearchSize = group.metrics?.document_metrics?.ai_search_size || group.activity?.document_metrics?.ai_search_size || 0; const aiSearchSizeFormatted = aiSearchSize > 0 ? this.formatBytes(aiSearchSize) : '0 B'; // Get document metrics - const totalDocs = group.activity?.document_metrics?.total_documents || 0; + const totalDocs = group.metrics?.document_metrics?.total_documents || group.activity?.document_metrics?.total_documents || 0; // Get group info - const memberCount = group.member_count || 0; + const memberCount = group.member_count || (group.users ?
group.users.length : 0); const ownerName = group.owner?.displayName || group.owner?.display_name || 'Unknown'; const ownerEmail = group.owner?.email || ''; + // Get status and format badge + const status = group.status || 'active'; + const statusConfig = { + 'active': { class: 'bg-success', text: 'Active' }, + 'locked': { class: 'bg-warning text-dark', text: 'Locked' }, + 'upload_disabled': { class: 'bg-info text-dark', text: 'Upload Disabled' }, + 'inactive': { class: 'bg-secondary', text: 'Inactive' } + }; + const statusInfo = statusConfig[status] || statusConfig['active']; + return ` -
${this.escapeHtml(group.name || 'Unnamed Group')}
+
${this.escapeHtml(group.name || 'Unnamed Group')}
${this.escapeHtml(group.description || 'No description')}
ID: ${group.id}
@@ -2528,20 +3048,21 @@ class ControlCenter {
${this.escapeHtml(ownerEmail)}
- Active -
${memberCount} members
+
${memberCount} member${memberCount === 1 ? '' : 's'}
- Active + ${statusInfo.text} -
Total Docs: ${totalDocs}
-
AI Search: ${aiSearchSizeFormatted}
-
Storage: ${storageSizeFormatted}
- ${group.activity?.document_metrics?.storage_account_size > 0 ? '
(Enhanced)
' : ''} +
+
Total Docs: ${totalDocs}
+
AI Search: ${aiSearchSizeFormatted}
+
Storage: ${storageSizeFormatted}
+ ${storageSize > 0 ? '
(Enhanced)
' : ''} +
- @@ -2558,9 +3079,14 @@ class ControlCenter { } manageGroup(groupId) { - // Placeholder for group management - can be implemented later - console.log('Managing group:', groupId); - alert('Group management functionality would open here'); + // Call the GroupManager's manageGroup function directly + console.log('ControlCenter.manageGroup() redirecting to GroupManager.manageGroup()'); + if (typeof GroupManager !== 'undefined' && GroupManager.manageGroup) { + GroupManager.manageGroup(groupId); + } else { + console.error('GroupManager not found or manageGroup method not available'); + showToast('Group management functionality is not available', 'danger'); + } } // Public Workspaces Management Methods @@ -2697,13 +3223,23 @@ class ControlCenter { const ownerName = workspace.owner?.displayName || workspace.owner?.display_name || workspace.owner_name || 'Unknown'; const ownerEmail = workspace.owner?.email || workspace.owner_email || ''; + // Get status and format badge + const status = workspace.status || 'active'; + const statusConfig = { + 'active': { class: 'bg-success', text: 'Active' }, + 'locked': { class: 'bg-warning text-dark', text: 'Locked' }, + 'upload_disabled': { class: 'bg-info text-dark', text: 'Upload Disabled' }, + 'inactive': { class: 'bg-secondary', text: 'Inactive' } + }; + const statusInfo = statusConfig[status] || statusConfig['active']; + return ` -
${workspace.name || 'Unnamed Workspace'}
+
${workspace.name || 'Unnamed Workspace'}
${workspace.description || 'No description'}
ID: ${workspace.id}
@@ -2712,16 +3248,18 @@ class ControlCenter {
${ownerEmail}
- ${memberCount} member${memberCount !== 1 ? 's' : ''} +
${memberCount} member${memberCount !== 1 ? 's' : ''}
- Active + ${statusInfo.text} -
Total Docs: ${totalDocs}
-
AI Search: ${aiSearchSizeFormatted}
-
Storage: ${storageSizeFormatted}
- ${workspace.activity?.document_metrics?.storage_account_size > 0 ? '
(Enhanced)
' : ''} +
+
Total Docs: ${totalDocs}
+
AI Search: ${aiSearchSizeFormatted}
+
Storage: ${storageSizeFormatted}
+ ${workspace.activity?.document_metrics?.storage_account_size > 0 ? '
(Enhanced)
' : ''} +
+ `} + +
+
+
+ + `; + } + + /** + * Escape HTML to prevent XSS + */ + function escapeHtml(text) { + const div = document.createElement('div'); + div.textContent = text; + return div.innerHTML; + } + + /** + * Render pagination + */ + function renderPagination(page, totalPages, hasMore) { + if (totalPages <= 1) return ''; + + let html = ''; + return html; + } + + /** + * Load and render notifications + */ + function loadNotifications() { + const container = document.getElementById('notifications-container'); + const paginationContainer = document.getElementById('pagination-container'); + const loadingIndicator = document.getElementById('loading-indicator'); + + if (!container) return; + + // Show loading + loadingIndicator.style.display = 'block'; + container.innerHTML = ''; + + // Build query parameters + const params = new URLSearchParams({ + page: currentPage, + per_page: currentPerPage, + include_read: currentFilter !== 'unread', + include_dismissed: false + }); + + fetch(`/api/notifications?${params}`) + .then(response => response.json()) + .then(data => { + loadingIndicator.style.display = 'none'; + + if (!data.success) { + container.innerHTML = '
Failed to load notifications
'; + return; + } + + // Filter by search if needed + let notifications = data.notifications; + + if (currentSearch) { + const searchLower = currentSearch.toLowerCase(); + notifications = notifications.filter(n => + n.title.toLowerCase().includes(searchLower) || + n.message.toLowerCase().includes(searchLower) + ); + } + + // Filter by read status + if (currentFilter === 'read') { + notifications = notifications.filter(n => n.is_read); + } else if (currentFilter === 'unread') { + notifications = notifications.filter(n => !n.is_read); + } + + // Cache notifications for click handlers + cachedNotifications = notifications; + + // Render notifications + if (notifications.length === 0) { + container.innerHTML = ` +
+ +

No notifications

+

You're all caught up!

+
+ `; + } else { + container.innerHTML = notifications.map(renderNotification).join(''); + + // Attach event listeners + attachNotificationListeners(); + } + + // Render pagination + const totalPages = Math.ceil(data.total / currentPerPage); + paginationContainer.innerHTML = renderPagination(currentPage, totalPages, data.has_more); + + // Attach pagination listeners + attachPaginationListeners(); + + // Update badge + pollNotificationCount(); + }) + .catch(error => { + console.error('Error loading notifications:', error); + loadingIndicator.style.display = 'none'; + container.innerHTML = '
Failed to load notifications
'; + }); + } + + /** + * Attach event listeners to notification items + */ + function attachNotificationListeners() { + // Click on notification to view/navigate + document.querySelectorAll('.notification-item').forEach(item => { + item.addEventListener('click', function(e) { + // Don't navigate if clicking action buttons + if (e.target.closest('.mark-read-btn') || e.target.closest('.dismiss-btn')) { + return; + } + + const notificationId = this.dataset.notificationId; + const notification = getNotificationById(notificationId); + + if (notification) { + handleNotificationClick(notification); + } + }); + }); + + // Mark as read buttons + document.querySelectorAll('.mark-read-btn').forEach(btn => { + btn.addEventListener('click', function(e) { + e.stopPropagation(); + const notificationId = this.dataset.notificationId; + markNotificationRead(notificationId); + }); + }); + + // Dismiss buttons + document.querySelectorAll('.dismiss-btn').forEach(btn => { + btn.addEventListener('click', function(e) { + e.stopPropagation(); + const notificationId = this.dataset.notificationId; + dismissNotification(notificationId); + }); + }); + } + + /** + * Attach pagination listeners + */ + function attachPaginationListeners() { + document.querySelectorAll('.page-link').forEach(link => { + link.addEventListener('click', function(e) { + e.preventDefault(); + const page = parseInt(this.dataset.page); + if (page && !isNaN(page)) { + currentPage = page; + loadNotifications(); + } + }); + }); + } + + /** + * Get notification data by ID (stored during render) + */ + let cachedNotifications = []; + + function getNotificationById(id) { + return cachedNotifications.find(n => n.id === id); + } + + /** + * Handle notification click + */ + async function handleNotificationClick(notification) { + // Mark as read + if (!notification.is_read) { + markNotificationRead(notification.id); + } + + // Check if this is a group notification - set active group before navigating + const groupId = notification.metadata?.group_id; + if (groupId && notification.link_url === '/group_workspaces') { + try { + const response = await fetch('/api/groups/setActive', { + method: 'PATCH', + headers: { + 'Content-Type': 'application/json' + }, + body: JSON.stringify({ groupId: groupId }) + }); + + if (!response.ok) { + console.error('Failed to set active group:', await response.text()); + } + } catch (error) { + console.error('Error setting active group:', error); + } + } + + // Navigate if link exists + if (notification.link_url) { + window.location.href = notification.link_url; + } + } + + /** + * Mark notification as read + */ + function markNotificationRead(notificationId) { + fetch(`/api/notifications/${notificationId}/read`, { + method: 'POST', + headers: { + 'Content-Type': 'application/json' + } + }) + .then(response => response.json()) + .then(data => { + if (data.success) { + loadNotifications(); + } + }) + .catch(error => { + console.error('Error marking notification as read:', error); + }); + } + + /** + * Dismiss notification + */ + function dismissNotification(notificationId) { + fetch(`/api/notifications/${notificationId}/dismiss`, { + method: 'DELETE' + }) + .then(response => response.json()) + .then(data => { + if (data.success) { + loadNotifications(); + } + }) + .catch(error => { + console.error('Error dismissing notification:', error); + }); + } + + /** + * Initialize page + */ + function initNotificationsPage() { + // Only run on notifications page + if (!window.location.pathname.includes('/notifications')) { + return; + } + 
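+    // The handlers below assume the notifications page template provides #per-page-select,
+    // .filter-btn, #search-input, #mark-all-read-btn and #refresh-btn (each block
+    // null-checks its element before wiring it). The per-page preference is persisted via
+    // POST /api/notifications/settings, and search input is debounced (500ms) before
+    // loadNotifications() re-renders the list.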
+ // Get user's per-page preference + const perPageSelect = document.getElementById('per-page-select'); + if (perPageSelect) { + currentPerPage = parseInt(perPageSelect.value); + + perPageSelect.addEventListener('change', function() { + currentPerPage = parseInt(this.value); + currentPage = 1; + + // Save preference + fetch('/api/notifications/settings', { + method: 'POST', + headers: { + 'Content-Type': 'application/json' + }, + body: JSON.stringify({ + notifications_per_page: currentPerPage + }) + }); + + loadNotifications(); + }); + } + + // Filter buttons + document.querySelectorAll('.filter-btn').forEach(btn => { + btn.addEventListener('click', function() { + document.querySelectorAll('.filter-btn').forEach(b => b.classList.remove('active')); + this.classList.add('active'); + currentFilter = this.dataset.filter; + currentPage = 1; + loadNotifications(); + }); + }); + + // Search input + const searchInput = document.getElementById('search-input'); + if (searchInput) { + let searchTimeout; + searchInput.addEventListener('input', function() { + clearTimeout(searchTimeout); + searchTimeout = setTimeout(() => { + currentSearch = this.value; + currentPage = 1; + loadNotifications(); + }, 500); + }); + } + + // Mark all as read button + const markAllReadBtn = document.getElementById('mark-all-read-btn'); + if (markAllReadBtn) { + markAllReadBtn.addEventListener('click', function() { + fetch('/api/notifications/mark-all-read', { + method: 'POST' + }) + .then(response => response.json()) + .then(data => { + if (data.success) { + loadNotifications(); + } + }) + .catch(error => { + console.error('Error marking all as read:', error); + }); + }); + } + + // Refresh button + const refreshBtn = document.getElementById('refresh-btn'); + if (refreshBtn) { + refreshBtn.addEventListener('click', function() { + loadNotifications(); + }); + } + + // Initial load + loadNotifications(); + } + + // Start polling when page loads (for badge updates) + if (document.readyState === 'loading') { + document.addEventListener('DOMContentLoaded', function() { + pollNotificationCount(); + initNotificationsPage(); + }); + } else { + pollNotificationCount(); + initNotificationsPage(); + } + + // Clean up on page unload + window.addEventListener('beforeunload', function() { + if (pollTimeout) { + clearTimeout(pollTimeout); + } + }); + +})(); diff --git a/application/single_app/static/js/profile-image.js b/application/single_app/static/js/profile-image.js index 37f1f168..7ada361c 100644 --- a/application/single_app/static/js/profile-image.js +++ b/application/single_app/static/js/profile-image.js @@ -17,21 +17,37 @@ let isLoading = false; const sidebar = document.getElementById('sidebar-profile-avatar'); if (topNav && userProfileImage) { + // Preserve notification badge if it exists + const existingBadge = topNav.querySelector('#notification-badge'); + const img = document.createElement('img'); img.src = userProfileImage; img.alt = 'Profile'; img.style.cssText = 'width: 28px; height: 28px; border-radius: 50%; object-fit: cover;'; topNav.innerHTML = ''; topNav.appendChild(img); + + // Re-append badge if it existed + if (existingBadge) { + topNav.appendChild(existingBadge); + } } if (sidebar && userProfileImage) { + // Preserve notification badge if it exists + const existingBadge = sidebar.querySelector('#sidebar-notification-badge'); + const img = document.createElement('img'); img.src = userProfileImage; img.alt = 'Profile'; img.style.cssText = 'width: 32px; height: 32px; border-radius: 50%; object-fit: cover;'; 
sidebar.innerHTML = ''; sidebar.appendChild(img); + + // Re-append badge if it existed + if (existingBadge) { + sidebar.appendChild(existingBadge); + } } } @@ -151,6 +167,9 @@ function updateTopNavAvatar() { const avatarElement = document.getElementById('top-nav-profile-avatar'); if (!avatarElement) return; + // Preserve notification badge if it exists + const existingBadge = avatarElement.querySelector('#notification-badge'); + if (userProfileImage) { const img = document.createElement('img'); img.src = userProfileImage; @@ -165,6 +184,11 @@ function updateTopNavAvatar() { avatarElement.innerHTML = ''; avatarElement.appendChild(img); avatarElement.style.backgroundColor = 'transparent'; + + // Re-append badge if it existed + if (existingBadge) { + avatarElement.appendChild(existingBadge); + } } else { // Keep the existing initials display, but use cached name if possible const nameElement = avatarElement.parentElement.querySelector('.fw-semibold'); @@ -176,6 +200,11 @@ function updateTopNavAvatar() { avatarElement.style.width = '28px'; avatarElement.style.height = '28px'; avatarElement.style.backgroundColor = '#6c757d'; + + // Re-append badge if it existed + if (existingBadge) { + avatarElement.appendChild(existingBadge); + } } } } @@ -187,6 +216,9 @@ function updateSidebarAvatar() { const sidebarAvatar = document.getElementById('sidebar-profile-avatar'); if (!sidebarAvatar) return; + // Preserve notification badge if it exists + const existingBadge = sidebarAvatar.querySelector('#sidebar-notification-badge'); + if (userProfileImage) { const img = document.createElement('img'); img.src = userProfileImage; @@ -201,6 +233,11 @@ function updateSidebarAvatar() { sidebarAvatar.innerHTML = ''; sidebarAvatar.appendChild(img); sidebarAvatar.style.backgroundColor = 'transparent'; + + // Re-append badge if it existed + if (existingBadge) { + sidebarAvatar.appendChild(existingBadge); + } } else { // Get initials for sidebar const nameElement = document.querySelector('#sidebar-user-account .fw-semibold'); @@ -212,6 +249,11 @@ function updateSidebarAvatar() { sidebarAvatar.style.width = '28px'; sidebarAvatar.style.height = '28px'; sidebarAvatar.style.backgroundColor = '#6c757d'; + + // Re-append badge if it existed + if (existingBadge) { + sidebarAvatar.appendChild(existingBadge); + } } } } diff --git a/application/single_app/static/js/public/manage_public_workspace.js b/application/single_app/static/js/public/manage_public_workspace.js index 402a55b8..ba1f5b09 100644 --- a/application/single_app/static/js/public/manage_public_workspace.js +++ b/application/single_app/static/js/public/manage_public_workspace.js @@ -12,6 +12,20 @@ $(document).ready(function () { loadMembers(); }); + // Initialize color picker + initializeColorPicker(); + + // Load stats when stats tab is shown + $('#stats-tab').on('shown.bs.tab', function () { + loadWorkspaceStats(); + }); + + // Activity timeline pagination + $('input[name="activityLimit"]').on('change', function() { + const limit = parseInt($(this).val()); + loadActivityTimeline(limit); + }); + // Edit workspace form (Owner only) $("#editWorkspaceForm").on("submit", function (e) { e.preventDefault(); @@ -139,6 +153,82 @@ $(document).ready(function () { $("#pendingRequestsTable").on("click", ".reject-request-btn", function () { rejectRequest($(this).data("id")); }); + + // CSV Bulk Upload Events + $("#addBulkMemberBtn").on("click", function () { + $("#csvBulkUploadModal").modal("show"); + }); + + $("#csvExampleBtn").on("click", downloadCsvExample); + 
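+  // Remaining CSV bulk-upload wiring: format help modal, client-side file validation, the staged upload, and modal reset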
$("#csvConfigBtn").on("click", showCsvConfig); + $("#csvFileInput").on("change", handleCsvFileSelect); + $("#csvNextBtn").on("click", startCsvUpload); + $("#csvDoneBtn").on("click", function () { + resetCsvModal(); + loadMembers(); + }); + + // Reset CSV modal when closed + $("#csvBulkUploadModal").on("hidden.bs.modal", function () { + resetCsvModal(); + }); + + // Bulk Actions Events + $("#selectAllMembers").on("change", function () { + const isChecked = $(this).prop("checked"); + $(".member-checkbox").prop("checked", isChecked); + updateBulkActionsBar(); + }); + + $(document).on("change", ".member-checkbox", function () { + updateBulkActionsBar(); + updateSelectAllCheckbox(); + }); + + $("#clearSelectionBtn").on("click", function () { + $(".member-checkbox").prop("checked", false); + $("#selectAllMembers").prop("checked", false); + updateBulkActionsBar(); + }); + + $("#bulkAssignRoleBtn").on("click", function () { + const selectedMembers = getSelectedMembers(); + if (selectedMembers.length === 0) { + alert("Please select at least one member"); + return; + } + $("#bulkRoleCount").text(selectedMembers.length); + $("#bulkAssignRoleModal").modal("show"); + }); + + $("#bulkAssignRoleForm").on("submit", function (e) { + e.preventDefault(); + bulkAssignRole(); + }); + + $("#bulkRemoveMembersBtn").on("click", function () { + const selectedMembers = getSelectedMembers(); + if (selectedMembers.length === 0) { + alert("Please select at least one member"); + return; + } + + // Populate the list of members to be removed + let membersList = "
<ul>";
+            selectedMembers.forEach(member => {
+                membersList += `<li>${member.name} (${member.email})</li>`;
+            });
+            membersList += "</ul>
"; + + $("#bulkRemoveCount").text(selectedMembers.length); + $("#bulkRemoveMembersList").html(membersList); + $("#bulkRemoveMembersModal").modal("show"); + }); + + $("#bulkRemoveMembersForm").on("submit", function (e) { + e.preventDefault(); + bulkRemoveMembers(); + }); }); @@ -148,16 +238,14 @@ $(document).ready(function () { function loadWorkspaceInfo(callback) { $.get(`/api/public_workspaces/${workspaceId}`) .done(function (ws) { + // Update status alert + updateWorkspaceStatusAlert(ws); const owner = ws.owner || {}; const admins = ws.admins || []; const docMgrs = ws.documentManagers || []; - // Header info - $("#workspaceInfoContainer").html(` -

<h4>${ws.name}</h4>
-        <p>${ws.description || ""}</p>
-        <p>Owner: ${owner.displayName} (${owner.email})</p>
- `); + // Update profile hero + updateProfileHero(ws, owner); // Determine role if (userId === owner.userId) { @@ -174,12 +262,25 @@ function loadWorkspaceInfo(callback) { $("#editWorkspaceContainer").show(); $("#editWorkspaceName").val(ws.name); $("#editWorkspaceDescription").val(ws.description); + + // Set selected color + const color = ws.heroColor || '#0078d4'; + $("#selectedColor").val(color); + updateHeroColor(color); + $(`.color-option[data-color="${color}"]`).addClass('selected'); + } + + // Show member actions for non-owners + if (currentUserRole !== "Owner" && currentUserRole) { + $("#memberActionsContainer").show(); } // Admin & Owner UI if (currentUserRole === "Owner" || currentUserRole === "Admin") { $("#addMemberBtn").show(); + $("#addBulkMemberBtn").show(); $("#pendingRequestsSection").show(); + $("#activityTimelineSection").show(); loadPendingRequests(); } @@ -194,7 +295,8 @@ function loadWorkspaceInfo(callback) { function updateWorkspaceInfo() { const data = { name: $("#editWorkspaceName").val().trim(), - description: $("#editWorkspaceDescription").val().trim() + description: $("#editWorkspaceDescription").val().trim(), + heroColor: $("#selectedColor").val() }; $.ajax({ url: `/api/public_workspaces/${workspaceId}`, @@ -223,8 +325,18 @@ function loadMembers(searchTerm = "", roleFilter = "") { $.get(url) .done(function (members) { const rows = members.map(m => { + const isOwner = m.role === "Owner"; + const checkboxHtml = isOwner || (currentUserRole !== "Owner" && currentUserRole !== "Admin") + ? '' + : ``; + return ` + ${checkboxHtml} ${m.displayName || "(no name)"}
${m.email || ""} @@ -235,10 +347,14 @@ function loadMembers(searchTerm = "", roleFilter = "") { `; }).join(""); $("#membersTable tbody").html(rows); + + // Reset selection UI + $("#selectAllMembers").prop("checked", false); + updateBulkActionsBar(); }) .fail(function () { $("#membersTable tbody").html( - `Failed to load members.` + `Failed to load members.` ); }); } @@ -436,3 +552,743 @@ function addMemberDirectly() { } }); } + +// --- New Functions for Profile Hero and Stats --- + +// Update profile hero section +function updateProfileHero(workspace, owner) { + const initial = workspace.name ? workspace.name.charAt(0).toUpperCase() : 'W'; + $('#workspaceInitial').text(initial); + $('#workspaceHeroName').text(workspace.name || 'Unnamed Workspace'); + $('#workspaceOwnerName').text(owner.displayName || 'Unknown'); + $('#workspaceOwnerEmail').text(owner.email || 'N/A'); + $('#workspaceHeroDescription').text(workspace.description || 'No description provided'); + + // Apply hero color + const color = workspace.heroColor || '#0078d4'; + updateHeroColor(color); +} + +// Update hero color +function updateHeroColor(color) { + const darker = adjustColorBrightness(color, -30); + document.documentElement.style.setProperty('--hero-color', color); + document.documentElement.style.setProperty('--hero-color-dark', darker); +} + +// Adjust color brightness +function adjustColorBrightness(color, percent) { + const num = parseInt(color.replace('#', ''), 16); + const amt = Math.round(2.55 * percent); + const R = (num >> 16) + amt; + const G = (num >> 8 & 0x00FF) + amt; + const B = (num & 0x0000FF) + amt; + return '#' + (0x1000000 + (R < 255 ? R < 1 ? 0 : R : 255) * 0x10000 + + (G < 255 ? G < 1 ? 0 : G : 255) * 0x100 + + (B < 255 ? B < 1 ? 0 : B : 255)) + .toString(16).slice(1); +} + +// Initialize color picker +function initializeColorPicker() { + $('.color-option').on('click', function() { + $('.color-option').removeClass('selected'); + $(this).addClass('selected'); + const color = $(this).data('color'); + $('#selectedColor').val(color); + updateHeroColor(color); + }); +} + +// Load workspace stats +let documentChart, storageChart, tokenChart; + +function loadWorkspaceStats() { + // Load stats data + $.get(`/api/public_workspaces/${workspaceId}/stats`) + .done(function(stats) { + updateStatCards(stats); + updateCharts(stats); + // Load activity timeline if user has permission + if (currentUserRole === "Owner" || currentUserRole === "Admin") { + loadActivityTimeline(50); + } + }) + .fail(function() { + console.error('Failed to load workspace stats'); + $('#stat-documents').text('N/A'); + $('#stat-storage').text('N/A'); + $('#stat-tokens').text('N/A'); + $('#stat-members').text('N/A'); + }); +} + +// Update stat cards +function updateStatCards(stats) { + $('#stat-documents').text(stats.totalDocuments || 0); + $('#stat-storage').text(formatBytes(stats.storageUsed || 0)); + $('#stat-tokens').text(formatNumber(stats.totalTokens || 0)); + $('#stat-members').text(stats.totalMembers || 0); +} + +// Update charts +function updateCharts(stats) { + // Document Activity Chart - Two bars for uploads and deletes + const docCtx = document.getElementById('documentChart'); + if (docCtx) { + if (documentChart) documentChart.destroy(); + documentChart = new Chart(docCtx, { + type: 'bar', + data: { + labels: stats.documentActivity?.labels || [], + datasets: [ + { + label: 'Uploads', + data: stats.documentActivity?.uploads || [], + backgroundColor: 'rgba(13, 202, 240, 0.8)', + borderColor: 'rgb(13, 202, 240)', + borderWidth: 1 + 
}, + { + label: 'Deletes', + data: stats.documentActivity?.deletes || [], + backgroundColor: 'rgba(220, 53, 69, 0.8)', + borderColor: 'rgb(220, 53, 69)', + borderWidth: 1 + } + ] + }, + options: { + responsive: true, + maintainAspectRatio: false, + plugins: { + legend: { + display: true, + position: 'top' + } + }, + scales: { + y: { + beginAtZero: true, + ticks: { precision: 0 } + } + } + } + }); + } + + // Storage Usage Chart (Doughnut) - AI Search and Blob Storage + const storageCtx = document.getElementById('storageChart'); + if (storageCtx) { + if (storageChart) storageChart.destroy(); + const aiSearch = stats.storage?.ai_search_size || 0; + const blobStorage = stats.storage?.storage_account_size || 0; + + storageChart = new Chart(storageCtx, { + type: 'doughnut', + data: { + labels: ['AI Search', 'Blob Storage'], + datasets: [{ + data: [aiSearch, blobStorage], + backgroundColor: [ + 'rgb(13, 110, 253)', + 'rgb(23, 162, 184)' + ], + borderWidth: 2 + }] + }, + options: { + responsive: true, + maintainAspectRatio: false, + plugins: { + legend: { position: 'bottom' }, + tooltip: { + callbacks: { + label: function(context) { + return context.label + ': ' + formatBytes(context.parsed); + } + } + } + } + } + }); + } + + // Token Usage Chart + const tokenCtx = document.getElementById('tokenChart'); + if (tokenCtx) { + if (tokenChart) tokenChart.destroy(); + tokenChart = new Chart(tokenCtx, { + type: 'bar', + data: { + labels: stats.tokenUsage?.labels || [], + datasets: [{ + label: 'Tokens Used', + data: stats.tokenUsage?.data || [], + backgroundColor: 'rgba(255, 193, 7, 0.7)', + borderColor: 'rgb(255, 193, 7)', + borderWidth: 1 + }] + }, + options: { + responsive: true, + maintainAspectRatio: false, + plugins: { + legend: { display: false } + }, + scales: { + y: { + beginAtZero: true, + ticks: { + callback: function(value) { + return formatNumber(value); + } + } + } + } + } + }); + } +} + +// Load activity timeline +function loadActivityTimeline(limit = 50) { + $.get(`/api/public_workspaces/${workspaceId}/activity?limit=${limit}`) + .done(function(activities) { + if (!activities || activities.length === 0) { + $('#activityTimeline').html('

<div class="text-center text-muted py-3">No recent activity</div>
'); + return; + } + + const html = activities.map(activity => renderActivityItem(activity)).join(''); + $('#activityTimeline').html(html); + }) + .fail(function(xhr) { + if (xhr.status === 403) { + $('#activityTimeline').html('

<div class="alert alert-warning mb-0">Access denied - Only workspace owners and admins can view activity timeline</div>
'); + } else { + $('#activityTimeline').html('

<div class="text-danger text-center py-3">Failed to load activity</div>
'); + } + }); +} + +// Render activity item +function renderActivityItem(activity) { + const icons = { + 'document_creation': 'file-earmark-arrow-up', + 'document_deletion': 'file-earmark-x', + 'token_usage': 'cpu', + 'user_login': 'box-arrow-in-right' + }; + + const colors = { + 'document_creation': 'success', + 'document_deletion': 'danger', + 'token_usage': 'primary', + 'user_login': 'info' + }; + + const activityType = activity.activity_type || 'unknown'; + const icon = icons[activityType] || 'circle'; + const color = colors[activityType] || 'secondary'; + const time = formatRelativeTime(activity.timestamp || activity.created_at); + + // Generate description based on activity type + let description = ''; + let title = activityType.replace(/_/g, ' ').replace(/\b\w/g, l => l.toUpperCase()); + + if (activityType === 'document_creation' && activity.document) { + description = `File: ${activity.document.file_name || 'Unknown'}`; + } else if (activityType === 'document_deletion' && activity.document_metadata) { + description = `File: ${activity.document_metadata.file_name || 'Unknown'}`; + } else if (activityType === 'token_usage' && activity.usage) { + description = `Tokens: ${formatNumber(activity.usage.total_tokens || 0)}`; + } else if (activityType === 'user_login') { + description = 'User logged in'; + } + + const activityJson = JSON.stringify(activity); + + return ` +
<div class="d-flex align-items-start mb-3" data-activity="${escapeHtml(activityJson)}" onclick="showRawActivity(this)">
+            <div class="text-${color} me-2">
+                <i class="bi bi-${icon}"></i>
+            </div>
+            <div class="flex-grow-1">
+                <div class="d-flex justify-content-between">
+                    <strong>${title}</strong>
+                    <small class="text-muted">${time}</small>
+                </div>
+                <div class="small text-muted">${description}</div>
+            </div>
+        </div>
+ `; +} + +// Format bytes +function formatBytes(bytes) { + if (bytes === 0) return '0 B'; + const k = 1024; + const sizes = ['B', 'KB', 'MB', 'GB', 'TB']; + const i = Math.floor(Math.log(bytes) / Math.log(k)); + return Math.round(bytes / Math.pow(k, i) * 100) / 100 + ' ' + sizes[i]; +} + +// Format number with commas +function formatNumber(num) { + return num.toString().replace(/\B(?=(\d{3})+(?!\d))/g, ','); +} + +// Show raw activity in modal +function showRawActivity(element) { + try { + const activityJson = element.getAttribute('data-activity'); + const activity = JSON.parse(activityJson); + const modalBody = document.getElementById('rawActivityModalBody'); + modalBody.innerHTML = `
<pre class="mb-0">${JSON.stringify(activity, null, 2)}</pre>
`; + $('#rawActivityModal').modal('show'); + } catch (error) { + console.error('Error showing raw activity:', error); + } +} + +// Copy raw activity to clipboard +function copyRawActivityToClipboard() { + const modalBody = document.getElementById('rawActivityModalBody'); + const text = modalBody.textContent; + navigator.clipboard.writeText(text).then(() => { + alert('Activity data copied to clipboard!'); + }).catch(err => { + console.error('Failed to copy:', err); + }); +} + +// Make functions globally available +window.showRawActivity = showRawActivity; +window.copyRawActivityToClipboard = copyRawActivityToClipboard; + +// Format relative time +function formatRelativeTime(timestamp) { + const now = new Date(); + const date = new Date(timestamp); + const diffMs = now - date; + const diffMins = Math.floor(diffMs / 60000); + const diffHours = Math.floor(diffMs / 3600000); + const diffDays = Math.floor(diffMs / 86400000); + + if (diffMins < 1) return 'Just now'; + if (diffMins < 60) return `${diffMins}m ago`; + if (diffHours < 24) return `${diffHours}h ago`; + if (diffDays < 7) return `${diffDays}d ago`; + return date.toLocaleDateString(); +} + +// ============================================================================ +// CSV Bulk Member Upload Functions +// ============================================================================ + +let csvParsedData = []; + +function downloadCsvExample() { + const csvContent = `userId,displayName,email,role +00000000-0000-0000-0000-000000000001,John Smith,john.smith@contoso.com,user +00000000-0000-0000-0000-000000000002,Jane Doe,jane.doe@contoso.com,admin +00000000-0000-0000-0000-000000000003,Bob Johnson,bob.johnson@contoso.com,document_manager`; + + const blob = new Blob([csvContent], { type: 'text/csv' }); + const url = window.URL.createObjectURL(blob); + const a = document.createElement('a'); + a.href = url; + a.download = 'bulk_members_example.csv'; + document.body.appendChild(a); + a.click(); + document.body.removeChild(a); + window.URL.revokeObjectURL(url); +} + +function showCsvConfig() { + const modal = new bootstrap.Modal(document.getElementById('csvFormatInfoModal')); + modal.show(); +} + +function validateGuid(guid) { + return ValidationUtils.validateGuid(guid); +} + +function validateEmail(email) { + const emailRegex = /^[^\s@]+@[^\s@]+\.[^\s@]+$/; + return emailRegex.test(email); +} + +function handleCsvFileSelect(event) { + const file = event.target.files[0]; + if (!file) { + $("#csvNextBtn").prop("disabled", true); + $("#csvValidationResults").hide(); + $("#csvErrorDetails").hide(); + return; + } + + const reader = new FileReader(); + reader.onload = function (e) { + const text = e.target.result; + const lines = text.split(/\r?\n/).filter(line => line.trim()); + + $("#csvErrorDetails").hide(); + $("#csvValidationResults").hide(); + + // Validate header + if (lines.length < 2) { + showCsvError("CSV must contain at least a header row and one data row"); + return; + } + + const header = lines[0].toLowerCase().trim(); + if (header !== "userid,displayname,email,role") { + showCsvError("Invalid header. Expected: userId,displayName,email,role"); + return; + } + + // Validate row count + const dataRows = lines.slice(1); + if (dataRows.length > 1000) { + showCsvError(`Too many rows. 
Maximum 1,000 members allowed (found ${dataRows.length})`); + return; + } + + // Parse and validate rows + csvParsedData = []; + const errors = []; + const validRoles = ['user', 'admin', 'document_manager']; + + for (let i = 0; i < dataRows.length; i++) { + const rowNum = i + 2; // +2 because header is row 1 + const row = dataRows[i].split(','); + + if (row.length !== 4) { + errors.push(`Row ${rowNum}: Expected 4 columns, found ${row.length}`); + continue; + } + + const userId = row[0].trim(); + const displayName = row[1].trim(); + const email = row[2].trim(); + const role = row[3].trim().toLowerCase(); + + if (!userId || !displayName || !email || !role) { + errors.push(`Row ${rowNum}: All fields are required`); + continue; + } + + if (!validateGuid(userId)) { + errors.push(`Row ${rowNum}: Invalid GUID format for userId`); + continue; + } + + if (!validateEmail(email)) { + errors.push(`Row ${rowNum}: Invalid email format`); + continue; + } + + if (!validRoles.includes(role)) { + errors.push(`Row ${rowNum}: Invalid role '${role}'. Must be: user, admin, or document_manager`); + continue; + } + + csvParsedData.push({ userId, displayName, email, role }); + } + + if (errors.length > 0) { + showCsvError(`Found ${errors.length} validation error(s):\n` + errors.slice(0, 10).join('\n') + + (errors.length > 10 ? `\n... and ${errors.length - 10} more` : '')); + return; + } + + // Show validation success + const sampleRows = csvParsedData.slice(0, 3); + $("#csvValidationDetails").html(` +

<p class="text-success mb-1"><strong>✓ Valid CSV file detected</strong></p>
+            <p class="mb-1">Total members to add: ${csvParsedData.length}</p>
+            <p class="mb-1">Sample data (first 3):</p>
+            <ul class="mb-0">
+                ${sampleRows.map(row => `<li>${row.displayName} (${row.email})</li>`).join('')}
+            </ul>
+ `); + $("#csvValidationResults").show(); + $("#csvNextBtn").prop("disabled", false); + }; + + reader.readAsText(file); +} + +function showCsvError(message) { + $("#csvErrorList").html(`
<div class="text-danger" style="white-space: pre-line;">${escapeHtml(message)}</div>
`); + $("#csvErrorDetails").show(); + $("#csvNextBtn").prop("disabled", true); + csvParsedData = []; +} + +function startCsvUpload() { + if (csvParsedData.length === 0) { + alert("No valid data to upload"); + return; + } + + // Switch to stage 2 + $("#csvStage1").hide(); + $("#csvStage2").show(); + $("#csvNextBtn").hide(); + $("#csvCancelBtn").hide(); + $("#csvModalClose").hide(); + + // Upload members + uploadCsvMembers(); +} + +async function uploadCsvMembers() { + let successCount = 0; + let failedCount = 0; + let skippedCount = 0; + const failures = []; + + for (let i = 0; i < csvParsedData.length; i++) { + const member = csvParsedData[i]; + const progress = Math.round(((i + 1) / csvParsedData.length) * 100); + + updateCsvProgress(progress, `Processing ${i + 1} of ${csvParsedData.length}: ${member.displayName}`); + + try { + const response = await fetch(`/api/public_workspaces/${workspaceId}/members`, { + method: 'POST', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ + userId: member.userId, + displayName: member.displayName, + email: member.email, + role: member.role + }) + }); + + const data = await response.json(); + + if (response.ok && data.success) { + successCount++; + } else if (data.error && data.error.includes('already a member')) { + skippedCount++; + } else { + failedCount++; + failures.push(`${member.displayName}: ${data.error || 'Unknown error'}`); + } + } catch (error) { + failedCount++; + failures.push(`${member.displayName}: ${error.message}`); + } + } + + // Show summary + showCsvSummary(successCount, failedCount, skippedCount, failures); +} + +function updateCsvProgress(percentage, statusText) { + $("#csvProgressBar").css("width", percentage + "%"); + $("#csvProgressBar").attr("aria-valuenow", percentage); + $("#csvProgressText").text(percentage + "%"); + $("#csvStatusText").text(statusText); +} + +function showCsvSummary(successCount, failedCount, skippedCount, failures) { + $("#csvStage2").hide(); + $("#csvStage3").show(); + $("#csvDoneBtn").show(); + + let summaryHtml = ` +

<p class="fw-bold mb-1">Upload Summary:</p>
+        <ul class="mb-0">
+            <li>✅ Successfully added: ${successCount}</li>
+            <li>⏭️ Skipped (already members): ${skippedCount}</li>
+            <li>❌ Failed: ${failedCount}</li>
+        </ul>
+ `; + + if (failures.length > 0) { + summaryHtml += ` +
<hr>
+        <p class="fw-bold mb-1">Failed Members:</p>
+        <ul class="mb-0">
+            ${failures.slice(0, 10).map(f => `<li>${escapeHtml(f)}</li>`).join('')}
+            ${failures.length > 10 ? `<li>... and ${failures.length - 10} more</li>` : ''}
+        </ul>
+ `; + } + + $("#csvSummary").html(summaryHtml); +} + +function resetCsvModal() { + // Reset to stage 1 + $("#csvStage1").show(); + $("#csvStage2").hide(); + $("#csvStage3").hide(); + $("#csvNextBtn").show(); + $("#csvNextBtn").prop("disabled", true); + $("#csvCancelBtn").show(); + $("#csvDoneBtn").hide(); + $("#csvModalClose").show(); + $("#csvValidationResults").hide(); + $("#csvErrorDetails").hide(); + $("#csvFileInput").val(''); + csvParsedData = []; + + // Reset progress + updateCsvProgress(0, 'Ready'); +} + +// ============================================================================ +// Bulk Member Actions Functions +// ============================================================================ + +function getSelectedMembers() { + const selected = []; + $(".member-checkbox:checked").each(function () { + selected.push({ + userId: $(this).data("user-id"), + name: $(this).data("user-name"), + email: $(this).data("user-email"), + role: $(this).data("user-role") + }); + }); + return selected; +} + +function updateBulkActionsBar() { + const selectedCount = $(".member-checkbox:checked").length; + if (selectedCount > 0) { + $("#selectedCount").text(selectedCount); + $("#bulkActionsBar").show(); + } else { + $("#bulkActionsBar").hide(); + } +} + +function updateSelectAllCheckbox() { + const totalCheckboxes = $(".member-checkbox").length; + const checkedCheckboxes = $(".member-checkbox:checked").length; + + if (totalCheckboxes > 0 && checkedCheckboxes === totalCheckboxes) { + $("#selectAllMembers").prop("checked", true); + $("#selectAllMembers").prop("indeterminate", false); + } else if (checkedCheckboxes > 0) { + $("#selectAllMembers").prop("checked", false); + $("#selectAllMembers").prop("indeterminate", true); + } else { + $("#selectAllMembers").prop("checked", false); + $("#selectAllMembers").prop("indeterminate", false); + } +} + +async function bulkAssignRole() { + const selectedMembers = getSelectedMembers(); + const newRole = $("#bulkRoleSelect").val(); + + if (selectedMembers.length === 0) { + alert("No members selected"); + return; + } + + // Close modal and show progress + $("#bulkAssignRoleModal").modal("hide"); + + let successCount = 0; + let failedCount = 0; + const failures = []; + + for (let i = 0; i < selectedMembers.length; i++) { + const member = selectedMembers[i]; + + try { + const response = await fetch(`/api/public_workspaces/${workspaceId}/members/${member.userId}`, { + method: 'PATCH', + headers: { 'Content-Type': 'application/json' }, + body: JSON.stringify({ role: newRole }) + }); + + const data = await response.json(); + + if (response.ok && data.success) { + successCount++; + } else { + failedCount++; + failures.push(`${member.name}: ${data.error || 'Unknown error'}`); + } + } catch (error) { + failedCount++; + failures.push(`${member.name}: ${error.message}`); + } + } + + // Show summary + let message = `Role assignment complete:\n✅ Success: ${successCount}\n❌ Failed: ${failedCount}`; + if (failures.length > 0) { + message += "\n\nFailed members:\n" + failures.slice(0, 5).join("\n"); + if (failures.length > 5) { + message += `\n... 
and ${failures.length - 5} more`; + } + } + alert(message); + + // Reload members and clear selection + loadMembers(); +} + +async function bulkRemoveMembers() { + const selectedMembers = getSelectedMembers(); + + if (selectedMembers.length === 0) { + alert("No members selected"); + return; + } + + // Close modal + $("#bulkRemoveMembersModal").modal("hide"); + + let successCount = 0; + let failedCount = 0; + const failures = []; + + for (let i = 0; i < selectedMembers.length; i++) { + const member = selectedMembers[i]; + + try { + const response = await fetch(`/api/public_workspaces/${workspaceId}/members/${member.userId}`, { + method: 'DELETE' + }); + + const data = await response.json(); + + if (response.ok && data.success) { + successCount++; + } else { + failedCount++; + failures.push(`${member.name}: ${data.error || 'Unknown error'}`); + } + } catch (error) { + failedCount++; + failures.push(`${member.name}: ${error.message}`); + } + } + + // Show summary + let message = `Member removal complete:\n✅ Success: ${successCount}\n❌ Failed: ${failedCount}`; + if (failures.length > 0) { + message += "\n\nFailed removals:\n" + failures.slice(0, 5).join("\n"); + if (failures.length > 5) { + message += `\n... and ${failures.length - 5} more`; + } + } + alert(message); + + // Reload members and clear selection + loadMembers(); +} diff --git a/application/single_app/static/js/public/my_public_workspaces.js b/application/single_app/static/js/public/my_public_workspaces.js index 7123d4b7..21e6d0ef 100644 --- a/application/single_app/static/js/public/my_public_workspaces.js +++ b/application/single_app/static/js/public/my_public_workspaces.js @@ -2,7 +2,7 @@ $(document).ready(function () { // Grab global active workspace ID (set via inline @@ -3031,6 +3512,7 @@
Security Considerat if (!simplemde) { simplemde = new SimpleMDE({ element: document.getElementById("landing_page_text_editor"), + autoDownloadFontAwesome: false, // Optionally add any SimpleMDE config here: // placeholder: "Enter your landing page markdown..." }); @@ -3158,14 +3640,36 @@
Security Considerat // Handle audio support toggle in main form const audioSupportToggle = document.getElementById('enable_audio_file_support'); + const speechInputToggle = document.getElementById('enable_speech_to_text_input'); + const ttsToggle = document.getElementById('enable_text_to_speech'); + + function updateAudioServiceSettings() { + const audioServiceSettings = document.getElementById('audio_service_settings'); + const audioEnabled = audioSupportToggle?.checked || false; + const speechInputEnabled = speechInputToggle?.checked || false; + const ttsEnabled = ttsToggle?.checked || false; + + if (audioServiceSettings) { + if (audioEnabled || speechInputEnabled || ttsEnabled) { + audioServiceSettings.style.display = 'block'; + } else { + audioServiceSettings.style.display = 'none'; + } + } + + updateWalkthroughRequirements(); + } + if (audioSupportToggle) { - audioSupportToggle.addEventListener('change', function() { - updateWalkthroughRequirements(); - - // Recalculate available steps and refresh navigation - const currentStep = getCurrentWalkthroughStep(); - navigateToWalkthroughStep(currentStep); - }); + audioSupportToggle.addEventListener('change', updateAudioServiceSettings); + } + + if (speechInputToggle) { + speechInputToggle.addEventListener('change', updateAudioServiceSettings); + } + + if (ttsToggle) { + ttsToggle.addEventListener('change', updateAudioServiceSettings); } // Handle Front Door toggle in main form @@ -3401,6 +3905,106 @@
Security Considerat updateTimerLimits('file'); }); + // Retention Policy Functions + function showManualExecutionModal() { + const modal = new bootstrap.Modal(document.getElementById('manualExecutionModal')); + + // Reset modal state + document.getElementById('execution-status').style.display = 'none'; + document.getElementById('execution-results').style.display = 'none'; + document.getElementById('manual_exec_personal').checked = false; + document.getElementById('manual_exec_group').checked = false; + document.getElementById('manual_exec_public').checked = false; + document.getElementById('execute-btn').disabled = false; + + modal.show(); + } + + function executeRetentionPolicy() { + const personal = document.getElementById('manual_exec_personal').checked; + const group = document.getElementById('manual_exec_group').checked; + const publicWs = document.getElementById('manual_exec_public').checked; + + const scopes = []; + if (personal) scopes.push('personal'); + if (group) scopes.push('group'); + if (publicWs) scopes.push('public'); + + if (scopes.length === 0) { + alert('Please select at least one workspace scope.'); + return; + } + + // Confirm before executing + if (!confirm(`Are you sure you want to execute retention policy for ${scopes.join(', ')} workspaces? This will delete aged items according to configured retention periods.`)) { + return; + } + + // Show processing status + document.getElementById('execution-status').style.display = 'block'; + document.getElementById('execution-status-text').textContent = 'Processing...'; + document.getElementById('execution-results').style.display = 'none'; + document.getElementById('execute-btn').disabled = true; + + // Execute via API + fetch('/api/admin/retention-policy/execute', { + method: 'POST', + headers: { + 'Content-Type': 'application/json' + }, + body: JSON.stringify({ scopes: scopes }) + }) + .then(response => response.json()) + .then(data => { + document.getElementById('execution-status').style.display = 'none'; + document.getElementById('execution-results').style.display = 'block'; + + if (data.success) { + const results = data.results; + let html = '
<div class="alert alert-success">Execution completed successfully!</div>';
+
+                html += '<table class="table table-sm table-bordered">';
+                html += '<thead><tr><th>Workspace Type</th><th>Conversations Deleted</th><th>Documents Deleted</th><th>Workspaces/Users Affected</th></tr></thead>';
+                html += '<tbody>';
+
+                if (scopes.includes('personal')) {
+                    html += `<tr><td>Personal</td><td>${results.personal.conversations}</td><td>${results.personal.documents}</td><td>${results.personal.users_affected} users</td></tr>`;
+                }
+                if (scopes.includes('group')) {
+                    html += `<tr><td>Group</td><td>${results.group.conversations}</td><td>${results.group.documents}</td><td>${results.group.workspaces_affected} groups</td></tr>`;
+                }
+                if (scopes.includes('public')) {
+                    html += `<tr><td>Public</td><td>${results.public.conversations}</td><td>${results.public.documents}</td><td>${results.public.workspaces_affected} workspaces</td></tr>`;
+                }
+
+                html += '</tbody></table>';
+                html += '<div class="text-muted small">Affected users/owners will receive notifications with details of deleted items.</div>';
+
+                document.getElementById('results-content').innerHTML = html;
+            } else {
+                document.getElementById('results-content').innerHTML = `
+                    <div class="alert alert-danger">
+                        <i class="bi bi-exclamation-triangle"></i>
+                        Execution failed: ${data.error || 'Unknown error'}
+                    </div>
+                `;
+            }
+
+            document.getElementById('execute-btn').disabled = false;
+        })
+        .catch(error => {
+            document.getElementById('execution-status').style.display = 'none';
+            document.getElementById('execution-results').style.display = 'block';
+            document.getElementById('results-content').innerHTML = `
+                <div class="alert alert-danger">
+                    <i class="bi bi-exclamation-triangle"></i>
+                    Error: ${error.message}
+                </div>
+ `; + document.getElementById('execute-btn').disabled = false; + }); + } + {% endblock %} diff --git a/application/single_app/templates/approvals.html b/application/single_app/templates/approvals.html new file mode 100644 index 00000000..a3388f23 --- /dev/null +++ b/application/single_app/templates/approvals.html @@ -0,0 +1,669 @@ +{% extends "base.html" %} + +{% block title %}Approval Requests - {{ app_settings.app_title }}{% endblock %} + +{% block content %} +
<div class="container-fluid">
+    <h2>Approval Requests</h2>
+
+    <div class="table-responsive">
+        <table class="table table-hover">
+            <thead>
+                <tr>
+                    <th>Request Type</th>
+                    <th>Group Name</th>
+                    <th>Requested By</th>
+                    <th>Created</th>
+                    <th>Status</th>
+                    <th>Actions</th>
+                </tr>
+            </thead>
+            <tbody>
+                <tr>
+                    <td colspan="6" class="text-center">
+                        <div class="spinner-border" role="status">
+                            <span class="visually-hidden">Loading...</span>
+                        </div>
+                        <div>Loading approvals...</div>
+                    </td>
+                </tr>
+            </tbody>
+        </table>
+    </div>
+</div>
+ + + +{% endblock %} + +{% block scripts %} + +{% endblock %} diff --git a/application/single_app/templates/base.html b/application/single_app/templates/base.html index 2696aa09..614f7870 100644 --- a/application/single_app/templates/base.html +++ b/application/single_app/templates/base.html @@ -215,7 +215,7 @@ {% if app_settings.classification_banner_enabled and app_settings.classification_banner_text %}
+ style="background: {{ app_settings.classification_banner_color or '#ffc107' }}; color: {{ app_settings.classification_banner_text_color or '#ffffff' }}; display: flex; align-items: center; justify-content: center; text-align: center; font-weight: bold; padding: 0; height: 40px; font-size: 1.1em; letter-spacing: 0.5px; position: fixed; top: 0; left: 0; width: 100%; z-index: 1051;"> {{ app_settings.classification_banner_text }}
{% endif %} @@ -336,6 +336,8 @@ + + {% block scripts %}{% endblock %} diff --git a/application/single_app/templates/chats.html b/application/single_app/templates/chats.html index 0ec65b1d..ec7c8242 100644 --- a/application/single_app/templates/chats.html +++ b/application/single_app/templates/chats.html @@ -5,6 +5,9 @@ {% block head %} + {% if app_settings.enable_speech_to_text_input %} + + {% endif %} {% endblock %} @@ -560,8 +644,9 @@
Activity Trends - Real-time, does not require refresh +
+ Real-time, does not require refresh
@@ -989,10 +1074,18 @@
@@ -1146,6 +1239,9 @@
File Upload Control
+ + + @@ -1473,9 +1604,20 @@
Group Ownership
+ @@ -1496,7 +1638,7 @@
Member Management
Add/remove members and assign roles
- See detailed group activity timeline @@ -1507,6 +1649,49 @@
Member Management
+ + {% if app_settings.enable_retention_policy_group %} +
+                        <h6>Retention Policy</h6>
+                        <p class="text-muted small">
+                            Configure automatic deletion of aged conversations and documents. Set to "No automatic deletion" to keep items indefinitely.
+                        </p>
+                        <!-- retention period selectors for conversations and documents -->
+ {% endif %} +
@@ -1649,13 +1834,16 @@
+                <!-- Members table columns: Name, Role, Actions -->
+
+                <!-- Workspace statistics cards: Total Documents (#stat-documents), Storage Used (#stat-storage), Total Tokens (#stat-tokens), Total Members (#stat-members) -->
+
+                <!-- Statistics charts: "Document Activity (Last 30 Days)" (canvas#documentChart), "Storage Usage" (canvas#storageChart), "Token Usage (Last 30 Days)" (canvas#tokenChart) -->