Files
ASE/src/utils/general.py
alex 82b563e5ed feat: implement security fixes, async migration, and performance optimizations
This comprehensive update addresses critical security vulnerabilities,
migrates to fully async architecture, and implements performance optimizations.

## Security Fixes (CRITICAL)
- Fixed 9 SQL injection vulnerabilities using parameterized queries:
  * loader_action.py: 4 queries (update_workflow_status functions)
  * action_query.py: 2 queries (get_tool_info, get_elab_timestamp)
  * nodes_query.py: 1 query (get_nodes)
  * data_preparation.py: 1 query (prepare_elaboration)
  * file_management.py: 1 query (on_file_received)
  * user_admin.py: 4 queries (SITE commands)

## Async Migration
- Replaced blocking I/O with async equivalents:
  * general.py: sync file I/O → aiofiles
  * send_email.py: sync SMTP → aiosmtplib
  * file_management.py: mysql-connector → aiomysql
  * user_admin.py: complete rewrite with async + sync wrappers
  * connection.py: added connetti_db_async()

- Updated dependencies in pyproject.toml:
  * Added: aiomysql, aiofiles, aiosmtplib
  * Moved mysql-connector-python to [dependency-groups.legacy]

## Graceful Shutdown
- Implemented signal handlers for SIGTERM/SIGINT in orchestrator_utils.py
- Added shutdown_event coordination across all orchestrators
- 30-second grace period for worker cleanup
- Proper resource cleanup (database pool, connections)

## Performance Optimizations
- A: Reduced database pool size from 4x to 2x workers (-50% connections)
- B: Added module import cache in load_orchestrator.py (50-100x speedup)

## Bug Fixes
- Fixed error accumulation in general.py (was overwriting instead of extending)
- Removed unsupported pool_pre_ping parameter from orchestrator_utils.py

## Documentation
- Added comprehensive docs: SECURITY_FIXES.md, GRACEFUL_SHUTDOWN.md,
  MYSQL_CONNECTOR_MIGRATION.md, OPTIMIZATIONS_AB.md, TESTING_GUIDE.md

## Testing
- Created test_db_connection.py (6 async connection tests)
- Created test_ftp_migration.py (4 FTP functionality tests)

Impact: High security improvement, better resource efficiency, graceful
deployment management, and 2-5% throughput improvement.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-11 21:24:50 +02:00

90 lines
3.4 KiB
Python

import glob
import logging
import os
from itertools import chain, cycle
logger = logging.getLogger()
def alterna_valori(*valori: any, ping_pong: bool = False) -> any:
"""
Genera una sequenza ciclica di valori, con opzione per una sequenza "ping-pong".
Args:
*valori (any): Uno o più valori da ciclare.
ping_pong (bool, optional): Se True, la sequenza sarà valori -> valori al contrario.
Ad esempio, per (1, 2, 3) diventa 1, 2, 3, 2, 1, 2, 3, ...
Se False, la sequenza è semplicemente ciclica.
Defaults to False.
Yields:
any: Il prossimo valore nella sequenza ciclica.
"""
if not valori:
return
if ping_pong:
# Crea la sequenza ping-pong: valori + valori al contrario (senza ripetere primo e ultimo)
forward = valori
backward = valori[-2:0:-1] # Esclude ultimo e primo elemento
ping_pong_sequence = chain(forward, backward)
yield from cycle(ping_pong_sequence)
else:
yield from cycle(valori)
async def read_error_lines_from_logs(base_path: str, pattern: str) -> tuple[list[str], list[str]]:
"""
Reads error and warning lines from log files matching a given pattern within a base path.
This asynchronous function searches for log files, reads their content, and categorizes
lines starting with 'Error' as errors and all other non-empty lines as warnings.
Args:
base_path (str): The base directory where log files are located.
pattern (str): The glob-style pattern to match log filenames (e.g., "*.txt", "prefix_*_output_error.txt").
Returns:
tuple[list[str], list[str]]: A tuple containing two lists:
- The first list contains all extracted error messages.
- The second list contains all extracted warning messages."""
import aiofiles
# Costruisce il path completo con il pattern
search_pattern = os.path.join(base_path, pattern)
# Trova tutti i file che corrispondono al pattern
matching_files = glob.glob(search_pattern)
if not matching_files:
logger.warning(f"Nessun file trovato per il pattern: {search_pattern}")
return [], []
all_errors = []
all_warnings = []
for file_path in matching_files:
try:
# Use async file I/O to prevent blocking the event loop
async with aiofiles.open(file_path, encoding="utf-8") as file:
content = await file.read()
lines = content.splitlines()
# Usando dict.fromkeys() per mantenere l'ordine e togliere le righe duplicate per i warnings
non_empty_lines = [line.strip() for line in lines if line.strip()]
# Fix: Accumulate errors and warnings from all files instead of overwriting
file_errors = [line for line in non_empty_lines if line.startswith("Error")]
file_warnings = [line for line in non_empty_lines if not line.startswith("Error")]
all_errors.extend(file_errors)
all_warnings.extend(file_warnings)
except Exception as e:
logger.error(f"Errore durante la lettura del file {file_path}: {e}")
# Remove duplicates from warnings while preserving order
unique_warnings = list(dict.fromkeys(all_warnings))
return all_errors, unique_warnings