Mirror of https://github.com/blw1138/Zordon.git (synced 2026-02-05 13:46:10 +00:00)
Compare commits: 8 commits, master...f6ee57fb55
| SHA1 |
|---|
| f6ee57fb55 |
| 2ba99cee31 |
| 13a82a540a |
| e7cecf6009 |
| 2fdabd3a9d |
| 2691c759ad |
| a036b8244f |
| 7b0d9a0b9f |
194 README.md
@@ -4,193 +4,43 @@
# Zordon
|
# Zordon
|
||||||
|
|
||||||
A Python-based distributed rendering management tool that supports Blender, FFmpeg, and other render engines. Zordon efficiently manages render jobs across multiple machines, making it ideal for small render farms in home studios or small businesses.
|
A lightweight, zero-install, distributed rendering and management tool designed to streamline and optimize rendering workflows across multiple machines
|
||||||
|
|
||||||
## What is Zordon?
|
## What is Zordon?
|
||||||
|
|
||||||
Zordon is a tool designed for small render farms, such as those used in home studios or small businesses, to efficiently manage and run render jobs for Blender, FFmpeg, and other video renderers. It simplifies the process of distributing rendering tasks across multiple available machines, optimizing the rendering workflow for artists, animators, and video professionals.
|
Zordon is a tool designed for small render farms, such as those used in home studios or small businesses, to efficiently manage and run render jobs for Blender, FFmpeg, and other video renderers. It simplifies the process of distributing rendering tasks across multiple available machines, optimizing the rendering workflow for artists, animators, and video professionals.
|
||||||
|
|
||||||
The system works by:
|
|
||||||
- **Server**: Central coordinator that manages job queues and distributes tasks to available workers
|
|
||||||
- **Clients**: Lightweight workers that run on rendering machines and execute assigned jobs
|
|
||||||
- **API**: RESTful endpoints for programmatic job submission and monitoring
|
|
||||||
|
|
||||||
## Features
|
|
||||||
|
|
||||||
- **Distributed Rendering**: Queue and distribute render jobs across multiple machines
|
|
||||||
- **Multi-Engine Support**: Compatible with Blender, FFmpeg, and extensible to other render engines
|
|
||||||
- **Desktop UI**: PyQt6 interface for job management and monitoring
|
|
||||||
- **REST API**: Flask-based API for programmatic access
|
|
||||||
- **Cross-Platform**: Runs on Windows, macOS, and Linux
|
|
||||||
- **Zero-Install Clients**: Lightweight client executables for worker machines
|
|
||||||
|
|
||||||
## Installation
|
|
||||||
|
|
||||||
### Prerequisites
|
|
||||||
|
|
||||||
- Python 3.11 or later
|
|
||||||
- Git
|
|
||||||
|
|
||||||
### Setup
|
|
||||||
|
|
||||||
1. Clone the repository:
|
|
||||||
```bash
|
|
||||||
git clone https://github.com/blw1138/Zordon.git
|
|
||||||
cd Zordon
|
|
||||||
```
|
|
||||||
|
|
||||||
2. Install dependencies:
|
|
||||||
```bash
|
|
||||||
pip install -r requirements.txt
|
|
||||||
```
|
|
||||||
|
|
||||||
3. (Optional) Install PyInstaller for building executables:
|
|
||||||
```bash
|
|
||||||
pip install pyinstaller pyinstaller_versionfile
|
|
||||||
```
|
|
||||||
|
|
||||||
## Usage
|
|
||||||
|
|
||||||
### Quick Start
|
|
||||||
|
|
||||||
1. **Start the Server**: Run the central server to coordinate jobs.
|
|
||||||
```bash
|
|
||||||
python server.py
|
|
||||||
```
|
|
||||||
|
|
||||||
2. **Launch Clients**: On each rendering machine, run the client to connect to the server.
|
|
||||||
```bash
|
|
||||||
python client.py
|
|
||||||
```
|
|
||||||
|
|
||||||
|
|
||||||
|
Notice: This should be considered a beta and is meant for casual / hobbyist use. Do not use in mission-critical environments!
|
||||||
|
|
||||||
### Detailed Workflow
|
## Supported Renderers
|
||||||
|
|
||||||
#### Setting Up a Render Farm
|
Zordon supports or plans to support the following renderers:
|
||||||
|
|
||||||
1. Choose one machine as the server (preferably a dedicated machine with good network connectivity).
|
|
||||||
2. Build and distribute client executables to worker machines:
|
|
||||||
```bash
|
|
||||||
pyinstaller client.spec
|
|
||||||
```
|
|
||||||
Copy the generated executable to each worker machine.
|
|
||||||
|
|
||||||
3. Ensure all machines can communicate via network (same subnet recommended).
|
|
||||||
|
|
||||||
#### Submitting Render Jobs
|
|
||||||
|
|
||||||
Jobs can be submitted via the desktop UI or programmatically via the API:
|
|
||||||
|
|
||||||
- **Via UI**: Use the desktop interface to upload project files, specify render settings, and queue jobs.
|
|
||||||
- **Via API**: Send POST requests to `/api/jobs` with job configuration in JSON format.
|
|
||||||
|
|
||||||
Example API request:
|
|
||||||
```bash
|
|
||||||
curl -X POST http://localhost:5000/api/jobs \
|
|
||||||
-H "Content-Type: application/json" \
|
|
||||||
-d '{
|
|
||||||
"engine": "blender",
|
|
||||||
"project_path": "/path/to/project.blend",
|
|
||||||
"output_path": "/path/to/output",
|
|
||||||
"frames": "1-100",
|
|
||||||
"settings": {"resolution": "1920x1080"}
|
|
||||||
}'
|
|
||||||
```
|
|
||||||
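The same request can be issued from Python. Below is a minimal sketch using `requests`, mirroring the curl example above; the host, port, and payload fields come from that example and are not verified against the server code.

```python
import requests

# Submit a render job to a Zordon server, using the same payload as the
# curl example above. Adjust the host, port, and paths for your setup.
job = {
    "engine": "blender",
    "project_path": "/path/to/project.blend",
    "output_path": "/path/to/output",
    "frames": "1-100",
    "settings": {"resolution": "1920x1080"},
}

response = requests.post("http://localhost:5000/api/jobs", json=job, timeout=30)
response.raise_for_status()
print(response.json())  # the server's description of the created job(s)
```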
|
|
||||||
#### Monitoring and Managing Jobs
|
|
||||||
|
|
||||||
- **UI**: View job status, progress, logs, and worker availability in real-time.
|
|
||||||
- **API Endpoints**:
|
|
||||||
- `GET /api/jobs`: List all jobs
|
|
||||||
- `GET /api/jobs/{id}`: Get job details
|
|
||||||
- `DELETE /api/jobs/{id}`: Cancel a job
|
|
||||||
- `GET /api/workers`: List connected workers
|
|
||||||
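A minimal sketch of driving these endpoints from Python follows; the base URL and the placeholder job id are assumptions, and the response shapes are whatever the server returns.

```python
import requests

BASE = "http://localhost:5000"  # assumed server address, as in the submission example

# List all jobs, fetch one job's details, then cancel it.
jobs = requests.get(f"{BASE}/api/jobs", timeout=30).json()
print(jobs)

job_id = "example-job-id"  # placeholder: use a real id from the listing above
details = requests.get(f"{BASE}/api/jobs/{job_id}", timeout=30).json()
print(details)

requests.delete(f"{BASE}/api/jobs/{job_id}", timeout=30)  # cancel the job
```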
|
|
||||||
#### Worker Management
|
|
||||||
|
|
||||||
Workers automatically connect to the server when started. You can:
|
|
||||||
- View worker status and capabilities in the dashboard
|
|
||||||
- Configure worker priorities and resource limits
|
|
||||||
- Monitor CPU/GPU usage per worker
|
|
||||||
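For scripted checks of worker availability, the `/api/workers` endpoint listed above can be queried directly; a small sketch (server address assumed):

```python
import requests

# Fetch the set of workers currently connected to the server.
response = requests.get("http://localhost:5000/api/workers", timeout=30)
response.raise_for_status()
print(response.json())  # raw payload describing the connected workers
```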
|
|
||||||
### Development Mode
|
|
||||||
|
|
||||||
For development and testing:
|
|
||||||
|
|
||||||
Run the server:
|
|
||||||
```bash
|
|
||||||
python server.py
|
|
||||||
```
|
|
||||||
|
|
||||||
Run a client (can run multiple for testing):
|
|
||||||
```bash
|
|
||||||
python client.py
|
|
||||||
```
|
|
||||||
|
|
||||||
|
|
||||||
### Building Executables
|
|
||||||
|
|
||||||
Build server executable:
|
|
||||||
```bash
|
|
||||||
pyinstaller server.spec
|
|
||||||
```
|
|
||||||
|
|
||||||
Build client executable:
|
|
||||||
```bash
|
|
||||||
pyinstaller client.spec
|
|
||||||
```
|
|
||||||
|
|
||||||
## Configuration
|
|
||||||
|
|
||||||
Settings are stored in `src/utilities/config.py`. Configuration supports YAML/JSON data serialization and environment-specific configurations.
|
|
||||||
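As a rough sketch of how the server consumes these settings at startup (the calls below are taken from the `server.py` changes shown later in this compare; treat it as illustrative rather than a supported API):

```python
from pathlib import Path

from src.utilities.config import Config

# Mirror the server's startup sequence: ensure the config directory exists,
# then load config.yaml from it (see the server.py hunk later in this compare).
Config.setup_config_dir()
config_path = Path(Config.config_dir()) / "config.yaml"
Config.load_config(config_path)

# A few of the settings the server reads afterwards.
print(Config.upload_folder, Config.port_number, Config.enable_split_jobs)
```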
|
|
||||||
## Architecture
|
|
||||||
|
|
||||||
Zordon follows a modular architecture with the following key components:
|
|
||||||
|
|
||||||
- **API Server** (`src/api/`): Flask-based REST API
|
|
||||||
- **Engine System** (`src/engines/`): Pluggable render engines (Blender, FFmpeg, etc.)
|
|
||||||
- **UI** (`src/ui/`): PyQt6-based interface
|
|
||||||
- **Job Management** (`src/render_queue.py`): Distributed job queue
|
|
||||||
|
|
||||||
Design patterns include Factory Pattern for engine creation, Observer Pattern for status updates, and Strategy Pattern for different worker implementations.
|
|
||||||
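To make the Factory Pattern concrete, here is a deliberately simplified, hypothetical sketch of an engine registry that maps names to engine classes; the real `EngineManager` in `src/engines/` is richer and its API differs.

```python
# Hypothetical illustration of the Factory Pattern used for engine creation.
# Class and method names are illustrative, not Zordon's actual API.

class RenderEngine:
    name = "base"

    def render(self, project_path: str, output_path: str) -> None:
        raise NotImplementedError

class BlenderEngine(RenderEngine):
    name = "blender"

    def render(self, project_path: str, output_path: str) -> None:
        print(f"blender -b {project_path} -o {output_path} -a")

class FFmpegEngine(RenderEngine):
    name = "ffmpeg"

    def render(self, project_path: str, output_path: str) -> None:
        print(f"ffmpeg -i {project_path} {output_path}")

ENGINES = {cls.name: cls for cls in (BlenderEngine, FFmpegEngine)}

def create_engine(name: str) -> RenderEngine:
    """Factory: look up the engine class by name and instantiate it."""
    try:
        return ENGINES[name]()
    except KeyError:
        raise LookupError(f"Cannot find engine '{name}'") from None

create_engine("blender").render("/path/to/project.blend", "/path/to/output")
```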
|
|
||||||
## Contributing
|
|
||||||
|
|
||||||
1. Fork the repository
|
|
||||||
2. Create a feature branch: `git checkout -b feature/your-feature`
|
|
||||||
3. Follow the code style guidelines in `AGENTS.md`
|
|
||||||
4. Test the build: `pyinstaller server.spec`
|
|
||||||
5. Submit a pull request
|
|
||||||
|
|
||||||
### Commit Message Format
|
|
||||||
|
|
||||||
```
|
|
||||||
feat: add support for new render engine
|
|
||||||
fix: resolve crash when engine path is invalid
|
|
||||||
docs: update API documentation
|
|
||||||
refactor: simplify job status handling
|
|
||||||
```
|
|
||||||
|
|
||||||
## Supported Render Engines
|
|
||||||
|
|
||||||
- **Blender**
|
- **Blender**
|
||||||
- **FFmpeg**
|
- **FFMPEG**
|
||||||
- **Adobe After Effects** (planned)
|
- **Adobe After Effects** ([coming soon](https://github.com/blw1138/Zordon/issues/84))
|
||||||
- **Cinema 4D** (planned)
|
- **Cinema 4D** ([planned](https://github.com/blw1138/Zordon/issues/105))
|
||||||
- **Autodesk Maya** (planned)
|
- **Autodesk Maya** ([planned](https://github.com/blw1138/Zordon/issues/106))
|
||||||
|
|
||||||
## System Requirements
|
## System Requirements
|
||||||
|
|
||||||
- Windows 10 or later
|
- Windows 10 or later
|
||||||
- macOS Ventura (13.0) or later
|
- macOS Ventura (13.0) or later
|
||||||
- Linux (supported versions TBD)
|
- Linux (Supported versions TBD)
|
||||||
|
|
||||||
|
## Build using Pyinstaller
|
||||||
|
|
||||||
|
Zordon is regularly tested with Python 3.11 and later. It is packaged and distributed with PyInstaller and is supported on Windows, macOS, and Linux.
|
||||||
|
|
||||||
|
```
|
||||||
|
git clone https://github.com/blw1138/Zordon.git
|
||||||
|
pip3 install -r requirements.txt
|
||||||
|
pip3 install pyinstaller
|
||||||
|
pip3 install pyinstaller_versionfile
|
||||||
|
pyinstaller main.spec
|
||||||
|
```
|
||||||
|
|
||||||
## License
|
## License
|
||||||
|
|
||||||
Zordon is licensed under the MIT License. See the [LICENSE](LICENSE.txt) file for more details.
|
Zordon is licensed under the MIT License. See the [LICENSE](LICENSE.txt) file for more details.
|
||||||
|
|
||||||
## Notice
|
|
||||||
|
|
||||||
This software is in beta and intended for casual/hobbyist use. Not recommended for mission-critical environments.
|
|
||||||
239 client.spec
@@ -1,158 +1,121 @@
|
|||||||
# -*- mode: python ; coding: utf-8 -*-
|
# -*- mode: python ; coding: utf-8 -*-
|
||||||
|
|
||||||
from PyInstaller.utils.hooks import collect_all
|
from PyInstaller.utils.hooks import collect_all
|
||||||
from pathlib import Path
|
|
||||||
|
# - get version from version file
|
||||||
import os
|
import os
|
||||||
import sys
|
import sys
|
||||||
import platform
|
import platform
|
||||||
|
src_path = os.path.abspath("src")
|
||||||
|
sys.path.insert(0, src_path)
|
||||||
|
from version import APP_NAME, APP_VERSION, APP_AUTHOR
|
||||||
|
sys.path.insert(0, os.path.abspath('.'))
|
||||||
|
|
||||||
# ------------------------------------------------------------
|
datas = [('resources', 'resources'), ('src/engines/blender/scripts/', 'src/engines/blender/scripts')]
|
||||||
# Project paths
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
|
|
||||||
project_root = Path(SPECPATH).resolve()
|
|
||||||
src_dir = project_root / "src"
|
|
||||||
|
|
||||||
# Ensure `src.*` imports work during analysis
|
|
||||||
sys.path.insert(0, str(project_root))
|
|
||||||
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
# Import version info
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
|
|
||||||
from src.version import APP_NAME, APP_VERSION, APP_AUTHOR
|
|
||||||
|
|
||||||
APP_NAME = f"{APP_NAME}-client"
|
|
||||||
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
# PyInstaller data / imports
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
|
|
||||||
datas = [
|
|
||||||
("resources", "resources"),
|
|
||||||
("src/engines/blender/scripts", "src/engines/blender/scripts"),
|
|
||||||
]
|
|
||||||
|
|
||||||
binaries = []
|
binaries = []
|
||||||
hiddenimports = ["zeroconf", "src.version"]
|
hiddenimports = ['zeroconf']
|
||||||
|
tmp_ret = collect_all('zeroconf')
|
||||||
|
datas += tmp_ret[0]; binaries += tmp_ret[1]; hiddenimports += tmp_ret[2]
|
||||||
|
|
||||||
tmp_ret = collect_all("zeroconf")
|
|
||||||
datas += tmp_ret[0]
|
|
||||||
binaries += tmp_ret[1]
|
|
||||||
hiddenimports += tmp_ret[2]
|
|
||||||
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
# Analysis
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
|
|
||||||
a = Analysis(
|
a = Analysis(
|
||||||
["client.py"],
|
['client.py'],
|
||||||
pathex=[str(project_root)],
|
pathex=[],
|
||||||
binaries=binaries,
|
binaries=binaries,
|
||||||
datas=datas,
|
datas=datas,
|
||||||
hiddenimports=hiddenimports,
|
hiddenimports=hiddenimports,
|
||||||
hookspath=[],
|
hookspath=[],
|
||||||
hooksconfig={},
|
hooksconfig={},
|
||||||
runtime_hooks=[],
|
runtime_hooks=[],
|
||||||
excludes=[],
|
excludes=[],
|
||||||
noarchive=False,
|
noarchive=False,
|
||||||
optimize=1, # optimize=2 breaks Windows builds
|
optimize=1, # fyi: optim level 2 breaks on windows
|
||||||
)
|
)
|
||||||
|
|
||||||
pyz = PYZ(a.pure)
|
pyz = PYZ(a.pure)
|
||||||
|
|
||||||
# ------------------------------------------------------------
|
if platform.system() == 'Darwin': # macOS
|
||||||
# Platform targets
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
|
|
||||||
if platform.system() == "Darwin": # macOS
|
exe = EXE(
|
||||||
|
pyz,
|
||||||
|
a.scripts,
|
||||||
|
[],
|
||||||
|
exclude_binaries=True,
|
||||||
|
name=APP_NAME,
|
||||||
|
debug=False,
|
||||||
|
bootloader_ignore_signals=False,
|
||||||
|
strip=True,
|
||||||
|
upx=True,
|
||||||
|
console=False,
|
||||||
|
disable_windowed_traceback=False,
|
||||||
|
argv_emulation=False,
|
||||||
|
target_arch=None,
|
||||||
|
codesign_identity=None,
|
||||||
|
entitlements_file=None,
|
||||||
|
)
|
||||||
|
app = BUNDLE(
|
||||||
|
exe,
|
||||||
|
a.binaries,
|
||||||
|
a.datas,
|
||||||
|
strip=True,
|
||||||
|
name=f'{APP_NAME}.app',
|
||||||
|
icon='resources/Server.png',
|
||||||
|
bundle_identifier=None,
|
||||||
|
version=APP_VERSION
|
||||||
|
)
|
||||||
|
|
||||||
exe = EXE(
|
elif platform.system() == 'Windows':
|
||||||
pyz,
|
|
||||||
a.scripts,
|
|
||||||
[],
|
|
||||||
exclude_binaries=True,
|
|
||||||
name=APP_NAME,
|
|
||||||
debug=False,
|
|
||||||
bootloader_ignore_signals=False,
|
|
||||||
strip=True,
|
|
||||||
upx=True,
|
|
||||||
console=False,
|
|
||||||
disable_windowed_traceback=False,
|
|
||||||
argv_emulation=False,
|
|
||||||
target_arch=None,
|
|
||||||
codesign_identity=None,
|
|
||||||
entitlements_file=None,
|
|
||||||
)
|
|
||||||
|
|
||||||
app = BUNDLE(
|
import pyinstaller_versionfile
|
||||||
exe,
|
import tempfile
|
||||||
a.binaries,
|
|
||||||
a.datas,
|
|
||||||
strip=True,
|
|
||||||
name=f"{APP_NAME}.app",
|
|
||||||
icon="resources/Server.png",
|
|
||||||
bundle_identifier=None,
|
|
||||||
version=APP_VERSION,
|
|
||||||
)
|
|
||||||
|
|
||||||
elif platform.system() == "Windows":
|
version_file_path = os.path.join(tempfile.gettempdir(), 'versionfile.txt')
|
||||||
|
|
||||||
import pyinstaller_versionfile
|
pyinstaller_versionfile.create_versionfile(
|
||||||
import tempfile
|
output_file=version_file_path,
|
||||||
|
version=APP_VERSION,
|
||||||
|
company_name=APP_AUTHOR,
|
||||||
|
file_description=APP_NAME,
|
||||||
|
internal_name=APP_NAME,
|
||||||
|
legal_copyright=f"© {APP_AUTHOR}",
|
||||||
|
original_filename=f"{APP_NAME}.exe",
|
||||||
|
product_name=APP_NAME
|
||||||
|
)
|
||||||
|
|
||||||
version_file_path = os.path.join(
|
exe = EXE(
|
||||||
tempfile.gettempdir(), "versionfile.txt"
|
pyz,
|
||||||
)
|
a.scripts,
|
||||||
|
a.binaries,
|
||||||
|
a.datas,
|
||||||
|
[],
|
||||||
|
name=APP_NAME,
|
||||||
|
debug=False,
|
||||||
|
bootloader_ignore_signals=False,
|
||||||
|
strip=True,
|
||||||
|
upx=True,
|
||||||
|
console=False,
|
||||||
|
disable_windowed_traceback=False,
|
||||||
|
argv_emulation=False,
|
||||||
|
target_arch=None,
|
||||||
|
codesign_identity=None,
|
||||||
|
entitlements_file=None,
|
||||||
|
version=version_file_path
|
||||||
|
)
|
||||||
|
|
||||||
pyinstaller_versionfile.create_versionfile(
|
else: # linux
|
||||||
output_file=version_file_path,
|
exe = EXE(
|
||||||
version=APP_VERSION,
|
pyz,
|
||||||
company_name=APP_AUTHOR,
|
a.scripts,
|
||||||
file_description=APP_NAME,
|
a.binaries,
|
||||||
internal_name=APP_NAME,
|
a.datas,
|
||||||
legal_copyright=f"© {APP_AUTHOR}",
|
[],
|
||||||
original_filename=f"{APP_NAME}.exe",
|
name=APP_NAME,
|
||||||
product_name=APP_NAME,
|
debug=False,
|
||||||
)
|
bootloader_ignore_signals=False,
|
||||||
|
strip=True,
|
||||||
exe = EXE(
|
upx=True,
|
||||||
pyz,
|
console=False,
|
||||||
a.scripts,
|
disable_windowed_traceback=False,
|
||||||
a.binaries,
|
argv_emulation=False,
|
||||||
a.datas,
|
target_arch=None,
|
||||||
[],
|
codesign_identity=None,
|
||||||
name=APP_NAME,
|
entitlements_file=None
|
||||||
debug=False,
|
)
|
||||||
bootloader_ignore_signals=False,
|
|
||||||
strip=True,
|
|
||||||
upx=True,
|
|
||||||
console=False,
|
|
||||||
disable_windowed_traceback=False,
|
|
||||||
argv_emulation=False,
|
|
||||||
target_arch=None,
|
|
||||||
codesign_identity=None,
|
|
||||||
entitlements_file=None,
|
|
||||||
version=version_file_path,
|
|
||||||
)
|
|
||||||
|
|
||||||
else: # Linux
|
|
||||||
|
|
||||||
exe = EXE(
|
|
||||||
pyz,
|
|
||||||
a.scripts,
|
|
||||||
a.binaries,
|
|
||||||
a.datas,
|
|
||||||
[],
|
|
||||||
name=APP_NAME,
|
|
||||||
debug=False,
|
|
||||||
bootloader_ignore_signals=False,
|
|
||||||
strip=True,
|
|
||||||
upx=True,
|
|
||||||
console=False,
|
|
||||||
disable_windowed_traceback=False,
|
|
||||||
argv_emulation=False,
|
|
||||||
target_arch=None,
|
|
||||||
codesign_identity=None,
|
|
||||||
entitlements_file=None,
|
|
||||||
)
|
|
||||||
|
|||||||
24 server.py
@@ -3,7 +3,6 @@ import multiprocessing
|
|||||||
import os
|
import os
|
||||||
import socket
|
import socket
|
||||||
import threading
|
import threading
|
||||||
from pathlib import Path
|
|
||||||
|
|
||||||
import psutil
|
import psutil
|
||||||
|
|
||||||
@@ -15,8 +14,8 @@ from src.distributed_job_manager import DistributedJobManager
|
|||||||
from src.engines.engine_manager import EngineManager
|
from src.engines.engine_manager import EngineManager
|
||||||
from src.render_queue import RenderQueue
|
from src.render_queue import RenderQueue
|
||||||
from src.utilities.config import Config
|
from src.utilities.config import Config
|
||||||
from src.utilities.misc_helper import (get_gpu_info, current_system_cpu, current_system_os,
|
from src.utilities.misc_helper import (get_gpu_info, system_safe_path, current_system_cpu, current_system_os,
|
||||||
current_system_os_version, current_system_cpu_brand)
|
current_system_os_version, current_system_cpu_brand, check_for_updates)
|
||||||
from src.utilities.zeroconf_server import ZeroconfServer
|
from src.utilities.zeroconf_server import ZeroconfServer
|
||||||
from src.version import APP_NAME, APP_VERSION
|
from src.version import APP_NAME, APP_VERSION
|
||||||
|
|
||||||
@@ -34,20 +33,23 @@ class ZordonServer:
|
|||||||
|
|
||||||
# Load Config YAML
|
# Load Config YAML
|
||||||
Config.setup_config_dir()
|
Config.setup_config_dir()
|
||||||
config_path = Path(Config.config_dir()) / "config.yaml"
|
Config.load_config(system_safe_path(os.path.join(Config.config_dir(), 'config.yaml')))
|
||||||
Config.load_config(config_path)
|
|
||||||
|
|
||||||
# configure default paths
|
# configure default paths
|
||||||
EngineManager.engines_path = str(Path(Config.upload_folder).expanduser()/ "engines")
|
EngineManager.engines_path = system_safe_path(
|
||||||
|
os.path.join(os.path.join(os.path.expanduser(Config.upload_folder),
|
||||||
|
'engines')))
|
||||||
os.makedirs(EngineManager.engines_path, exist_ok=True)
|
os.makedirs(EngineManager.engines_path, exist_ok=True)
|
||||||
PreviewManager.storage_path = Path(Config.upload_folder).expanduser() / "previews"
|
PreviewManager.storage_path = system_safe_path(
|
||||||
|
os.path.join(os.path.expanduser(Config.upload_folder), 'previews'))
|
||||||
|
|
||||||
self.api_server = None
|
self.api_server = None
|
||||||
self.server_hostname: str = socket.gethostname()
|
self.server_hostname = None
|
||||||
|
|
||||||
def start_server(self):
|
def start_server(self):
|
||||||
|
|
||||||
def existing_process(process_name):
|
def existing_process(process_name):
|
||||||
|
import psutil
|
||||||
current_pid = os.getpid()
|
current_pid = os.getpid()
|
||||||
current_process = psutil.Process(current_pid)
|
current_process = psutil.Process(current_pid)
|
||||||
for proc in psutil.process_iter(['pid', 'name', 'ppid']):
|
for proc in psutil.process_iter(['pid', 'name', 'ppid']):
|
||||||
@@ -70,15 +72,15 @@ class ZordonServer:
|
|||||||
|
|
||||||
# main start
|
# main start
|
||||||
logger.info(f"Starting {APP_NAME} Render Server ({APP_VERSION})")
|
logger.info(f"Starting {APP_NAME} Render Server ({APP_VERSION})")
|
||||||
logger.debug(f"Upload directory: {Path(Config.upload_folder).expanduser()}")
|
logger.debug(f"Upload directory: {os.path.expanduser(Config.upload_folder)}")
|
||||||
logger.debug(f"Thumbs directory: {PreviewManager.storage_path}")
|
logger.debug(f"Thumbs directory: {PreviewManager.storage_path}")
|
||||||
logger.debug(f"Engines directory: {EngineManager.engines_path}")
|
logger.debug(f"Engines directory: {EngineManager.engines_path}")
|
||||||
# Set up the RenderQueue object
|
# Set up the RenderQueue object
|
||||||
RenderQueue.load_state(database_directory=Path(Config.upload_folder).expanduser())
|
RenderQueue.load_state(database_directory=system_safe_path(os.path.expanduser(Config.upload_folder)))
|
||||||
ServerProxyManager.subscribe_to_listener()
|
ServerProxyManager.subscribe_to_listener()
|
||||||
DistributedJobManager.subscribe_to_listener()
|
DistributedJobManager.subscribe_to_listener()
|
||||||
|
|
||||||
# update hostname
|
# get hostname
|
||||||
self.server_hostname = socket.gethostname()
|
self.server_hostname = socket.gethostname()
|
||||||
|
|
||||||
# configure and start API server
|
# configure and start API server
|
||||||
|
|||||||
212 server.spec
@@ -1,158 +1,90 @@
|
|||||||
# -*- mode: python ; coding: utf-8 -*-
|
# -*- mode: python ; coding: utf-8 -*-
|
||||||
|
|
||||||
from PyInstaller.utils.hooks import collect_all
|
from PyInstaller.utils.hooks import collect_all
|
||||||
from pathlib import Path
|
|
||||||
|
# - get version from version file
|
||||||
import os
|
import os
|
||||||
import sys
|
import sys
|
||||||
import platform
|
import platform
|
||||||
|
sys.path.insert(0, os.path.abspath('.'))
|
||||||
|
from version import APP_NAME, APP_VERSION, APP_AUTHOR
|
||||||
|
|
||||||
# ------------------------------------------------------------
|
APP_NAME = APP_NAME + " Server"
|
||||||
# Project paths
|
datas = [('resources', 'resources'), ('src/engines/blender/scripts/', 'src/engines/blender/scripts')]
|
||||||
# ------------------------------------------------------------
|
|
||||||
|
|
||||||
project_root = Path(SPECPATH).resolve()
|
|
||||||
src_dir = project_root / "src"
|
|
||||||
|
|
||||||
# Ensure `src.*` imports work during analysis
|
|
||||||
sys.path.insert(0, str(project_root))
|
|
||||||
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
# Import version info
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
|
|
||||||
from src.version import APP_NAME, APP_VERSION, APP_AUTHOR
|
|
||||||
|
|
||||||
APP_NAME = f"{APP_NAME}-server"
|
|
||||||
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
# PyInstaller data / imports
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
|
|
||||||
datas = [
|
|
||||||
("resources", "resources"),
|
|
||||||
("src/engines/blender/scripts", "src/engines/blender/scripts"),
|
|
||||||
]
|
|
||||||
|
|
||||||
binaries = []
|
binaries = []
|
||||||
hiddenimports = ["zeroconf", "src.version"]
|
hiddenimports = ['zeroconf']
|
||||||
|
tmp_ret = collect_all('zeroconf')
|
||||||
|
datas += tmp_ret[0]; binaries += tmp_ret[1]; hiddenimports += tmp_ret[2]
|
||||||
|
|
||||||
tmp_ret = collect_all("zeroconf")
|
|
||||||
datas += tmp_ret[0]
|
|
||||||
binaries += tmp_ret[1]
|
|
||||||
hiddenimports += tmp_ret[2]
|
|
||||||
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
# Analysis
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
|
|
||||||
a = Analysis(
|
a = Analysis(
|
||||||
["server.py"],
|
['server.py'],
|
||||||
pathex=[str(project_root)],
|
pathex=[],
|
||||||
binaries=binaries,
|
binaries=binaries,
|
||||||
datas=datas,
|
datas=datas,
|
||||||
hiddenimports=hiddenimports,
|
hiddenimports=hiddenimports,
|
||||||
hookspath=[],
|
hookspath=[],
|
||||||
hooksconfig={},
|
hooksconfig={},
|
||||||
runtime_hooks=[],
|
runtime_hooks=[],
|
||||||
excludes=[],
|
excludes=[],
|
||||||
noarchive=False,
|
noarchive=False,
|
||||||
optimize=1, # optimize=2 breaks Windows builds
|
optimize=1, # fyi: optim level 2 breaks on windows
|
||||||
)
|
)
|
||||||
|
|
||||||
pyz = PYZ(a.pure)
|
pyz = PYZ(a.pure)
|
||||||
|
|
||||||
# ------------------------------------------------------------
|
if platform.system() == 'Windows':
|
||||||
# Platform targets
|
|
||||||
# ------------------------------------------------------------
|
|
||||||
|
|
||||||
if platform.system() == "Darwin": # macOS
|
import pyinstaller_versionfile
|
||||||
|
import tempfile
|
||||||
|
|
||||||
exe = EXE(
|
version_file_path = os.path.join(tempfile.gettempdir(), 'versionfile.txt')
|
||||||
pyz,
|
|
||||||
a.scripts,
|
|
||||||
[],
|
|
||||||
exclude_binaries=True,
|
|
||||||
name=APP_NAME,
|
|
||||||
debug=False,
|
|
||||||
bootloader_ignore_signals=False,
|
|
||||||
strip=True,
|
|
||||||
upx=True,
|
|
||||||
console=False,
|
|
||||||
disable_windowed_traceback=False,
|
|
||||||
argv_emulation=False,
|
|
||||||
target_arch=None,
|
|
||||||
codesign_identity=None,
|
|
||||||
entitlements_file=None,
|
|
||||||
)
|
|
||||||
|
|
||||||
app = BUNDLE(
|
pyinstaller_versionfile.create_versionfile(
|
||||||
exe,
|
output_file=version_file_path,
|
||||||
a.binaries,
|
version=APP_VERSION,
|
||||||
a.datas,
|
company_name=APP_AUTHOR,
|
||||||
strip=True,
|
file_description=APP_NAME,
|
||||||
name=f"{APP_NAME}.app",
|
internal_name=APP_NAME,
|
||||||
icon="resources/Server.png",
|
legal_copyright=f"© {APP_AUTHOR}",
|
||||||
bundle_identifier=None,
|
original_filename=f"{APP_NAME}.exe",
|
||||||
version=APP_VERSION,
|
product_name=APP_NAME
|
||||||
)
|
)
|
||||||
|
|
||||||
elif platform.system() == "Windows":
|
exe = EXE(
|
||||||
|
pyz,
|
||||||
|
a.scripts,
|
||||||
|
a.binaries,
|
||||||
|
a.datas,
|
||||||
|
[],
|
||||||
|
name=APP_NAME,
|
||||||
|
debug=False,
|
||||||
|
bootloader_ignore_signals=False,
|
||||||
|
strip=True,
|
||||||
|
upx=True,
|
||||||
|
console=True,
|
||||||
|
disable_windowed_traceback=False,
|
||||||
|
argv_emulation=False,
|
||||||
|
target_arch=None,
|
||||||
|
codesign_identity=None,
|
||||||
|
entitlements_file=None,
|
||||||
|
version=version_file_path
|
||||||
|
)
|
||||||
|
|
||||||
import pyinstaller_versionfile
|
else: # linux / macOS
|
||||||
import tempfile
|
exe = EXE(
|
||||||
|
pyz,
|
||||||
version_file_path = os.path.join(
|
a.scripts,
|
||||||
tempfile.gettempdir(), "versionfile.txt"
|
a.binaries,
|
||||||
)
|
a.datas,
|
||||||
|
[],
|
||||||
pyinstaller_versionfile.create_versionfile(
|
name=APP_NAME,
|
||||||
output_file=version_file_path,
|
debug=False,
|
||||||
version=APP_VERSION,
|
bootloader_ignore_signals=False,
|
||||||
company_name=APP_AUTHOR,
|
strip=True,
|
||||||
file_description=APP_NAME,
|
upx=True,
|
||||||
internal_name=APP_NAME,
|
console=False,
|
||||||
legal_copyright=f"© {APP_AUTHOR}",
|
disable_windowed_traceback=False,
|
||||||
original_filename=f"{APP_NAME}.exe",
|
argv_emulation=False,
|
||||||
product_name=APP_NAME,
|
target_arch=None,
|
||||||
)
|
codesign_identity=None,
|
||||||
|
entitlements_file=None
|
||||||
exe = EXE(
|
)
|
||||||
pyz,
|
|
||||||
a.scripts,
|
|
||||||
a.binaries,
|
|
||||||
a.datas,
|
|
||||||
[],
|
|
||||||
name=APP_NAME,
|
|
||||||
debug=False,
|
|
||||||
bootloader_ignore_signals=False,
|
|
||||||
strip=True,
|
|
||||||
upx=True,
|
|
||||||
console=False,
|
|
||||||
disable_windowed_traceback=False,
|
|
||||||
argv_emulation=False,
|
|
||||||
target_arch=None,
|
|
||||||
codesign_identity=None,
|
|
||||||
entitlements_file=None,
|
|
||||||
version=version_file_path,
|
|
||||||
)
|
|
||||||
|
|
||||||
else: # Linux
|
|
||||||
|
|
||||||
exe = EXE(
|
|
||||||
pyz,
|
|
||||||
a.scripts,
|
|
||||||
a.binaries,
|
|
||||||
a.datas,
|
|
||||||
[],
|
|
||||||
name=APP_NAME,
|
|
||||||
debug=False,
|
|
||||||
bootloader_ignore_signals=False,
|
|
||||||
strip=True,
|
|
||||||
upx=True,
|
|
||||||
console=False,
|
|
||||||
disable_windowed_traceback=False,
|
|
||||||
argv_emulation=False,
|
|
||||||
target_arch=None,
|
|
||||||
codesign_identity=None,
|
|
||||||
entitlements_file=None,
|
|
||||||
)
|
|
||||||
|
|||||||
@@ -2,15 +2,14 @@
|
|||||||
import concurrent.futures
|
import concurrent.futures
|
||||||
import json
|
import json
|
||||||
import logging
|
import logging
|
||||||
|
import os
|
||||||
|
import pathlib
|
||||||
import shutil
|
import shutil
|
||||||
import socket
|
import socket
|
||||||
import ssl
|
import ssl
|
||||||
import tempfile
|
import tempfile
|
||||||
import time
|
import time
|
||||||
import traceback
|
|
||||||
from datetime import datetime
|
from datetime import datetime
|
||||||
from pathlib import Path
|
|
||||||
from typing import Dict, Any, Optional
|
|
||||||
|
|
||||||
import cpuinfo
|
import cpuinfo
|
||||||
import psutil
|
import psutil
|
||||||
@@ -24,7 +23,7 @@ from src.distributed_job_manager import DistributedJobManager
|
|||||||
from src.engines.engine_manager import EngineManager
|
from src.engines.engine_manager import EngineManager
|
||||||
from src.render_queue import RenderQueue, JobNotFoundError
|
from src.render_queue import RenderQueue, JobNotFoundError
|
||||||
from src.utilities.config import Config
|
from src.utilities.config import Config
|
||||||
from src.utilities.misc_helper import current_system_os, current_system_cpu, \
|
from src.utilities.misc_helper import system_safe_path, current_system_os, current_system_cpu, \
|
||||||
current_system_os_version, num_to_alphanumeric, get_gpu_info
|
current_system_os_version, num_to_alphanumeric, get_gpu_info
|
||||||
from src.utilities.status_utils import string_to_status
|
from src.utilities.status_utils import string_to_status
|
||||||
from src.version import APP_VERSION
|
from src.version import APP_VERSION
|
||||||
@@ -35,7 +34,7 @@ ssl._create_default_https_context = ssl._create_unverified_context # disable SS
|
|||||||
|
|
||||||
API_VERSION = "0.1"
|
API_VERSION = "0.1"
|
||||||
|
|
||||||
def start_api_server(hostname: Optional[str] = None) -> None:
|
def start_api_server(hostname=None):
|
||||||
|
|
||||||
# get hostname
|
# get hostname
|
||||||
if not hostname:
|
if not hostname:
|
||||||
@@ -45,7 +44,7 @@ def start_api_server(hostname: Optional[str] = None) -> None:
|
|||||||
# load flask settings
|
# load flask settings
|
||||||
server.config['HOSTNAME'] = hostname
|
server.config['HOSTNAME'] = hostname
|
||||||
server.config['PORT'] = int(Config.port_number)
|
server.config['PORT'] = int(Config.port_number)
|
||||||
server.config['UPLOAD_FOLDER'] = str(Path(Config.upload_folder).expanduser())
|
server.config['UPLOAD_FOLDER'] = system_safe_path(os.path.expanduser(Config.upload_folder))
|
||||||
server.config['MAX_CONTENT_PATH'] = Config.max_content_path
|
server.config['MAX_CONTENT_PATH'] = Config.max_content_path
|
||||||
server.config['enable_split_jobs'] = Config.enable_split_jobs
|
server.config['enable_split_jobs'] = Config.enable_split_jobs
|
||||||
|
|
||||||
@@ -66,7 +65,7 @@ def start_api_server(hostname: Optional[str] = None) -> None:
|
|||||||
# --------------------------------------------
|
# --------------------------------------------
|
||||||
|
|
||||||
@server.get('/api/jobs')
|
@server.get('/api/jobs')
|
||||||
def jobs_json() -> Dict[str, Any]:
|
def jobs_json():
|
||||||
"""Retrieves all jobs from the render queue in JSON format.
|
"""Retrieves all jobs from the render queue in JSON format.
|
||||||
|
|
||||||
This endpoint fetches all jobs currently in the render queue, converts them to JSON format,
|
This endpoint fetches all jobs currently in the render queue, converts them to JSON format,
|
||||||
@@ -135,9 +134,9 @@ def get_job_logs(job_id):
|
|||||||
Response: The log file's content as plain text, or an empty response if the log file is not found.
|
Response: The log file's content as plain text, or an empty response if the log file is not found.
|
||||||
"""
|
"""
|
||||||
found_job = RenderQueue.job_with_id(job_id)
|
found_job = RenderQueue.job_with_id(job_id)
|
||||||
log_path = Path(found_job.log_path())
|
log_path = system_safe_path(found_job.log_path())
|
||||||
log_data = None
|
log_data = None
|
||||||
if log_path and log_path.exists():
|
if log_path and os.path.exists(log_path):
|
||||||
with open(log_path) as file:
|
with open(log_path) as file:
|
||||||
log_data = file.read()
|
log_data = file.read()
|
||||||
return Response(log_data, mimetype='text/plain')
|
return Response(log_data, mimetype='text/plain')
|
||||||
@@ -145,21 +144,20 @@ def get_job_logs(job_id):
|
|||||||
|
|
||||||
@server.get('/api/job/<job_id>/file_list')
|
@server.get('/api/job/<job_id>/file_list')
|
||||||
def get_file_list(job_id):
|
def get_file_list(job_id):
|
||||||
return [Path(p).name for p in RenderQueue.job_with_id(job_id).file_list()]
|
return [os.path.basename(x) for x in RenderQueue.job_with_id(job_id).file_list()]
|
||||||
|
|
||||||
|
|
||||||
@server.route('/api/job/<job_id>/download')
|
@server.route('/api/job/<job_id>/download')
|
||||||
def download_requested_file(job_id):
|
def download_requested_file(job_id):
|
||||||
requested_filename = request.args.get("filename")
|
|
||||||
|
requested_filename = request.args.get('filename')
|
||||||
if not requested_filename:
|
if not requested_filename:
|
||||||
return "Filename required", 400
|
return 'Filename required', 400
|
||||||
|
|
||||||
found_job = RenderQueue.job_with_id(job_id)
|
found_job = RenderQueue.job_with_id(job_id)
|
||||||
|
for job_filename in found_job.file_list():
|
||||||
for job_file in found_job.file_list():
|
if os.path.basename(job_filename).lower() == requested_filename.lower():
|
||||||
p = Path(job_file)
|
return send_file(job_filename, as_attachment=True, )
|
||||||
if p.name.lower() == requested_filename.lower():
|
|
||||||
return send_file(str(p), as_attachment=True)
|
|
||||||
|
|
||||||
return f"File '{requested_filename}' not found", 404
|
return f"File '{requested_filename}' not found", 404
|
||||||
|
|
||||||
@@ -170,27 +168,26 @@ def download_all_files(job_id):
|
|||||||
|
|
||||||
@after_this_request
|
@after_this_request
|
||||||
def clear_zip(response):
|
def clear_zip(response):
|
||||||
if zip_filename and zip_filename.exists():
|
if zip_filename and os.path.exists(zip_filename):
|
||||||
try:
|
try:
|
||||||
zip_filename.unlink()
|
os.remove(zip_filename)
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
logger.warning(f"Error removing zip file '{zip_filename}': {e}")
|
logger.warning(f"Error removing zip file '{zip_filename}': {e}")
|
||||||
return response
|
return response
|
||||||
|
|
||||||
found_job = RenderQueue.job_with_id(job_id)
|
found_job = RenderQueue.job_with_id(job_id)
|
||||||
|
output_dir = os.path.dirname(found_job.output_path)
|
||||||
output_dir = Path(found_job.output_path).parent
|
if os.path.exists(output_dir):
|
||||||
if not output_dir.exists():
|
from zipfile import ZipFile
|
||||||
return f"Cannot find project files for job {job_id}", 500
|
zip_filename = system_safe_path(os.path.join(tempfile.gettempdir(),
|
||||||
|
pathlib.Path(found_job.input_path).stem + '.zip'))
|
||||||
zip_filename = Path(tempfile.gettempdir()) / f"{Path(found_job.input_path).stem}.zip"
|
with ZipFile(zip_filename, 'w') as zipObj:
|
||||||
from zipfile import ZipFile
|
for f in os.listdir(output_dir):
|
||||||
with ZipFile(zip_filename, "w") as zipObj:
|
zipObj.write(filename=system_safe_path(os.path.join(output_dir, f)),
|
||||||
for f in output_dir.iterdir():
|
arcname=os.path.basename(f))
|
||||||
if f.is_file():
|
return send_file(zip_filename, mimetype="zip", as_attachment=True, )
|
||||||
zipObj.write(f, arcname=f.name)
|
else:
|
||||||
|
return f'Cannot find project files for job {job_id}', 500
|
||||||
return send_file(str(zip_filename), mimetype="zip", as_attachment=True)
|
|
||||||
|
|
||||||
|
|
||||||
# --------------------------------------------
|
# --------------------------------------------
|
||||||
@@ -198,8 +195,8 @@ def download_all_files(job_id):
|
|||||||
# --------------------------------------------
|
# --------------------------------------------
|
||||||
|
|
||||||
@server.get('/api/presets')
|
@server.get('/api/presets')
|
||||||
def presets() -> Dict[str, Any]:
|
def presets():
|
||||||
presets_path = Path('config/presets.yaml')
|
presets_path = system_safe_path('config/presets.yaml')
|
||||||
with open(presets_path) as f:
|
with open(presets_path) as f:
|
||||||
loaded_presets = yaml.load(f, Loader=yaml.FullLoader)
|
loaded_presets = yaml.load(f, Loader=yaml.FullLoader)
|
||||||
return loaded_presets
|
return loaded_presets
|
||||||
@@ -295,12 +292,30 @@ def add_job_handler():
|
|||||||
err_msg = f"Error processing job data: {e}"
|
err_msg = f"Error processing job data: {e}"
|
||||||
return err_msg, 400
|
return err_msg, 400
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
traceback.print_exception(e)
|
|
||||||
err_msg = f"Unknown error processing data: {e}"
|
err_msg = f"Unknown error processing data: {e}"
|
||||||
return err_msg, 500
|
return err_msg, 500
|
||||||
|
|
||||||
try:
|
try:
|
||||||
return JobImportHandler.create_jobs_from_processed_data(processed_job_data)
|
loaded_project_local_path = processed_job_data['__loaded_project_local_path']
|
||||||
|
created_jobs = []
|
||||||
|
if processed_job_data.get("child_jobs"):
|
||||||
|
for child_job_diffs in processed_job_data["child_jobs"]:
|
||||||
|
processed_child_job_data = processed_job_data.copy()
|
||||||
|
processed_child_job_data.pop("child_jobs")
|
||||||
|
processed_child_job_data.update(child_job_diffs)
|
||||||
|
child_job = DistributedJobManager.create_render_job(processed_child_job_data, loaded_project_local_path)
|
||||||
|
created_jobs.append(child_job)
|
||||||
|
else:
|
||||||
|
new_job = DistributedJobManager.create_render_job(processed_job_data, loaded_project_local_path)
|
||||||
|
created_jobs.append(new_job)
|
||||||
|
|
||||||
|
# Save notes to .txt
|
||||||
|
if processed_job_data.get("notes"):
|
||||||
|
parent_dir = os.path.dirname(os.path.dirname(loaded_project_local_path))
|
||||||
|
notes_name = processed_job_data['name'] + "-notes.txt"
|
||||||
|
with open(os.path.join(parent_dir, notes_name), "w") as f:
|
||||||
|
f.write(processed_job_data["notes"])
|
||||||
|
return [x.json() for x in created_jobs]
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
logger.exception(f"Error creating render job: {e}")
|
logger.exception(f"Error creating render job: {e}")
|
||||||
return 'unknown error', 500
|
return 'unknown error', 500
|
||||||
@@ -323,18 +338,13 @@ def cancel_job(job_id):
|
|||||||
@server.route('/api/job/<job_id>/delete', methods=['POST', 'GET'])
|
@server.route('/api/job/<job_id>/delete', methods=['POST', 'GET'])
|
||||||
def delete_job(job_id):
|
def delete_job(job_id):
|
||||||
try:
|
try:
|
||||||
if not request.args.get("confirm", False):
|
if not request.args.get('confirm', False):
|
||||||
return "Confirmation required to delete job", 400
|
return 'Confirmation required to delete job', 400
|
||||||
|
|
||||||
|
# Check if we can remove the 'output' directory
|
||||||
found_job = RenderQueue.job_with_id(job_id)
|
found_job = RenderQueue.job_with_id(job_id)
|
||||||
|
project_dir = os.path.dirname(os.path.dirname(found_job.input_path))
|
||||||
input_path = Path(found_job.input_path)
|
output_dir = os.path.dirname(found_job.output_path)
|
||||||
output_path = Path(found_job.output_path)
|
|
||||||
upload_root = Path(server.config["UPLOAD_FOLDER"])
|
|
||||||
|
|
||||||
project_dir = input_path.parent.parent
|
|
||||||
output_dir = output_path.parent
|
|
||||||
|
|
||||||
found_job.stop()
|
found_job.stop()
|
||||||
|
|
||||||
try:
|
try:
|
||||||
@@ -342,24 +352,25 @@ def delete_job(job_id):
|
|||||||
except Exception as e:
|
except Exception as e:
|
||||||
logger.error(f"Error deleting previews for {found_job}: {e}")
|
logger.error(f"Error deleting previews for {found_job}: {e}")
|
||||||
|
|
||||||
|
# finally delete the job
|
||||||
RenderQueue.delete_job(found_job)
|
RenderQueue.delete_job(found_job)
|
||||||
|
|
||||||
# Delete output directory if we own it
|
# delete the output_dir
|
||||||
if output_dir.exists() and output_dir.is_relative_to(upload_root):
|
if server.config['UPLOAD_FOLDER'] in output_dir and os.path.exists(output_dir):
|
||||||
shutil.rmtree(output_dir)
|
shutil.rmtree(output_dir)
|
||||||
|
|
||||||
# Delete project directory if we own it and it's unused
|
# See if we own the project_dir (i.e. was it uploaded) - if so delete the directory
|
||||||
try:
|
try:
|
||||||
if project_dir.exists() and project_dir.is_relative_to(upload_root):
|
if server.config['UPLOAD_FOLDER'] in project_dir and os.path.exists(project_dir):
|
||||||
project_dir_files = [p for p in project_dir.iterdir() if not p.name.startswith(".")]
|
# check to see if any other projects are sharing the same project file
|
||||||
if not project_dir_files or (len(project_dir_files) == 1 and "source" in project_dir_files[0].name):
|
project_dir_files = [f for f in os.listdir(project_dir) if not f.startswith('.')]
|
||||||
|
if len(project_dir_files) == 0 or (len(project_dir_files) == 1 and 'source' in project_dir_files[0]):
|
||||||
logger.info(f"Removing project directory: {project_dir}")
|
logger.info(f"Removing project directory: {project_dir}")
|
||||||
shutil.rmtree(project_dir)
|
shutil.rmtree(project_dir)
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
logger.error(f"Error removing project files: {e}")
|
logger.error(f"Error removing project files: {e}")
|
||||||
|
|
||||||
return "Job deleted", 200
|
return "Job deleted", 200
|
||||||
|
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
logger.error(f"Error deleting job: {e}")
|
logger.error(f"Error deleting job: {e}")
|
||||||
return f"Error deleting job: {e}", 500
|
return f"Error deleting job: {e}", 500
|
||||||
@@ -369,26 +380,6 @@ def delete_job(job_id):
|
|||||||
# Engine Info and Management:
|
# Engine Info and Management:
|
||||||
# --------------------------------------------
|
# --------------------------------------------
|
||||||
|
|
||||||
@server.get('/api/engine_for_filename')
|
|
||||||
def get_engine_for_filename():
|
|
||||||
filename = request.args.get("filename")
|
|
||||||
if not filename:
|
|
||||||
return "Error: filename is required", 400
|
|
||||||
found_engine = EngineManager.engine_class_for_project_path(filename)
|
|
||||||
if not found_engine:
|
|
||||||
return f"Error: cannot find a suitable engine for '{filename}'", 400
|
|
||||||
return found_engine.name()
|
|
||||||
|
|
||||||
@server.get('/api/installed_engines')
|
|
||||||
def get_installed_engines():
|
|
||||||
result = {}
|
|
||||||
for engine_class in EngineManager.supported_engines():
|
|
||||||
data = EngineManager.all_version_data_for_engine(engine_class.name())
|
|
||||||
if data:
|
|
||||||
result[engine_class.name()] = data
|
|
||||||
return result
|
|
||||||
|
|
||||||
|
|
||||||
@server.get('/api/engine_info')
|
@server.get('/api/engine_info')
|
||||||
def engine_info():
|
def engine_info():
|
||||||
response_type = request.args.get('response_type', 'standard')
|
response_type = request.args.get('response_type', 'standard')
|
||||||
@@ -428,7 +419,6 @@ def engine_info():
|
|||||||
return result
|
return result
|
||||||
|
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
traceback.print_exc(e)
|
|
||||||
logger.error(f"Error fetching details for engine '{engine.name()}': {e}")
|
logger.error(f"Error fetching details for engine '{engine.name()}': {e}")
|
||||||
return {}
|
return {}
|
||||||
|
|
||||||
@@ -444,40 +434,6 @@ def engine_info():
|
|||||||
return engine_data
|
return engine_data
|
||||||
|
|
||||||
|
|
||||||
@server.get('/api/<engine_name>/info')
|
|
||||||
def get_engine_info(engine_name):
|
|
||||||
try:
|
|
||||||
response_type = request.args.get('response_type', 'standard')
|
|
||||||
# Get all installed versions of the engine
|
|
||||||
installed_versions = EngineManager.all_version_data_for_engine(engine_name)
|
|
||||||
if not installed_versions:
|
|
||||||
return {}
|
|
||||||
|
|
||||||
result = { 'is_available': RenderQueue.is_available_for_job(engine_name),
|
|
||||||
'versions': installed_versions
|
|
||||||
}
|
|
||||||
|
|
||||||
if response_type == 'full':
|
|
||||||
with concurrent.futures.ThreadPoolExecutor() as executor:
|
|
||||||
engine_class = EngineManager.engine_class_with_name(engine_name)
|
|
||||||
en = EngineManager.get_latest_engine_instance(engine_class)
|
|
||||||
future_results = {
|
|
||||||
'supported_extensions': executor.submit(en.supported_extensions),
|
|
||||||
'supported_export_formats': executor.submit(en.get_output_formats),
|
|
||||||
'system_info': executor.submit(en.system_info),
|
|
||||||
'options': executor.submit(en.ui_options)
|
|
||||||
}
|
|
||||||
|
|
||||||
for key, future in future_results.items():
|
|
||||||
result[key] = future.result()
|
|
||||||
|
|
||||||
return result
|
|
||||||
|
|
||||||
except Exception as e:
|
|
||||||
logger.error(f"Error fetching details for engine '{engine_name}': {e}")
|
|
||||||
return {}
|
|
||||||
|
|
||||||
|
|
||||||
@server.get('/api/<engine_name>/is_available')
|
@server.get('/api/<engine_name>/is_available')
|
||||||
def is_engine_available(engine_name):
|
def is_engine_available(engine_name):
|
||||||
return {'engine': engine_name, 'available': RenderQueue.is_available_for_job(engine_name),
|
return {'engine': engine_name, 'available': RenderQueue.is_available_for_job(engine_name),
|
||||||
@@ -486,27 +442,6 @@ def is_engine_available(engine_name):
|
|||||||
'hostname': server.config['HOSTNAME']}
|
'hostname': server.config['HOSTNAME']}
|
||||||
|
|
||||||
|
|
||||||
@server.get('/api/engine/<engine_name>/args')
|
|
||||||
def get_engine_args(engine_name):
|
|
||||||
try:
|
|
||||||
engine_class = EngineManager.engine_class_with_name(engine_name)
|
|
||||||
return engine_class().get_arguments()
|
|
||||||
except LookupError:
|
|
||||||
return f"Cannot find engine '{engine_name}'", 400
|
|
||||||
|
|
||||||
|
|
||||||
@server.get('/api/engine/<engine_name>/help')
|
|
||||||
def get_engine_help(engine_name):
|
|
||||||
try:
|
|
||||||
engine_class = EngineManager.engine_class_with_name(engine_name)
|
|
||||||
return engine_class().get_help()
|
|
||||||
except LookupError:
|
|
||||||
return f"Cannot find engine '{engine_name}'", 400
|
|
||||||
|
|
||||||
# --------------------------------------------
|
|
||||||
# Engine Downloads and Updates:
|
|
||||||
# --------------------------------------------
|
|
||||||
|
|
||||||
@server.get('/api/is_engine_available_to_download')
|
@server.get('/api/is_engine_available_to_download')
|
||||||
def is_engine_available_to_download():
|
def is_engine_available_to_download():
|
||||||
available_result = EngineManager.version_is_available_to_download(request.args.get('engine'),
|
available_result = EngineManager.version_is_available_to_download(request.args.get('engine'),
|
||||||
@@ -547,6 +482,24 @@ def delete_engine_download():
|
|||||||
(f"Error deleting {json_data.get('engine')} {json_data.get('version')}", 500)
|
(f"Error deleting {json_data.get('engine')} {json_data.get('version')}", 500)
|
||||||
|
|
||||||
|
|
||||||
|
@server.get('/api/engine/<engine_name>/args')
|
||||||
|
def get_engine_args(engine_name):
|
||||||
|
try:
|
||||||
|
engine_class = EngineManager.engine_with_name(engine_name)
|
||||||
|
return engine_class().get_arguments()
|
||||||
|
except LookupError:
|
||||||
|
return f"Cannot find engine '{engine_name}'", 400
|
||||||
|
|
||||||
|
|
||||||
|
@server.get('/api/engine/<engine_name>/help')
|
||||||
|
def get_engine_help(engine_name):
|
||||||
|
try:
|
||||||
|
engine_class = EngineManager.engine_with_name(engine_name)
|
||||||
|
return engine_class().get_help()
|
||||||
|
except LookupError:
|
||||||
|
return f"Cannot find engine '{engine_name}'", 400
|
||||||
|
|
||||||
|
|
||||||
# --------------------------------------------
|
# --------------------------------------------
|
||||||
# Miscellaneous:
|
# Miscellaneous:
|
||||||
# --------------------------------------------
|
# --------------------------------------------
|
||||||
@@ -621,18 +574,8 @@ def handle_detached_instance(_):
|
|||||||
return "Unavailable", 503
|
return "Unavailable", 503
|
||||||
|
|
||||||
|
|
||||||
@server.errorhandler(404)
|
|
||||||
def handle_404(error):
|
|
||||||
url = request.url
|
|
||||||
err_msg = f"404 Not Found: {url}"
|
|
||||||
if 'favicon' not in url:
|
|
||||||
logger.warning(err_msg)
|
|
||||||
return err_msg, 404
|
|
||||||
|
|
||||||
|
|
||||||
@server.errorhandler(Exception)
|
@server.errorhandler(Exception)
|
||||||
def handle_general_error(general_error):
|
def handle_general_error(general_error):
|
||||||
traceback.print_exception(type(general_error), general_error, general_error.__traceback__)
|
|
||||||
err_msg = f"Server error: {general_error}"
|
err_msg = f"Server error: {general_error}"
|
||||||
logger.error(err_msg)
|
logger.error(err_msg)
|
||||||
return err_msg, 500
|
return err_msg, 500
|
||||||
|
|||||||
@@ -5,77 +5,18 @@ import shutil
|
|||||||
import tempfile
|
import tempfile
|
||||||
import zipfile
|
import zipfile
|
||||||
from datetime import datetime
|
from datetime import datetime
|
||||||
from pathlib import Path
|
|
||||||
|
|
||||||
import requests
|
import requests
|
||||||
from tqdm import tqdm
|
from tqdm import tqdm
|
||||||
from werkzeug.utils import secure_filename
|
from werkzeug.utils import secure_filename
|
||||||
|
|
||||||
from distributed_job_manager import DistributedJobManager
|
|
||||||
|
|
||||||
logger = logging.getLogger()
|
logger = logging.getLogger()
|
||||||
|
|
||||||
|
|
||||||
class JobImportHandler:
|
class JobImportHandler:
|
||||||
"""Handles job import operations for rendering projects.
|
|
||||||
|
|
||||||
This class provides functionality to validate, download, and process
|
|
||||||
job data and project files for the rendering queue system.
|
|
||||||
"""
|
|
||||||
|
|
||||||
@classmethod
|
@classmethod
|
||||||
def create_jobs_from_processed_data(cls, processed_job_data: dict) -> list[dict]:
|
def validate_job_data(cls, new_job_data, upload_directory, uploaded_file=None):
|
||||||
""" Takes processed job data and creates new jobs
|
|
||||||
|
|
||||||
Args: processed_job_data: Dictionary containing job information"""
|
|
||||||
loaded_project_local_path = processed_job_data['__loaded_project_local_path']
|
|
||||||
|
|
||||||
# prepare child job data
|
|
||||||
job_data_to_create = []
|
|
||||||
if processed_job_data.get("child_jobs"):
|
|
||||||
for child_job_diffs in processed_job_data["child_jobs"]:
|
|
||||||
processed_child_job_data = processed_job_data.copy()
|
|
||||||
processed_child_job_data.pop("child_jobs")
|
|
||||||
processed_child_job_data.update(child_job_diffs)
|
|
||||||
job_data_to_create.append(processed_child_job_data)
|
|
||||||
else:
|
|
||||||
job_data_to_create.append(processed_job_data)
|
|
||||||
|
|
||||||
# create the jobs
|
|
||||||
created_jobs = []
|
|
||||||
for job_data in job_data_to_create:
|
|
||||||
new_job = DistributedJobManager.create_render_job(job_data, loaded_project_local_path)
|
|
||||||
created_jobs.append(new_job)
|
|
||||||
|
|
||||||
# Save notes to .txt
|
|
||||||
if processed_job_data.get("notes"):
|
|
||||||
parent_dir = Path(loaded_project_local_path).parent.parent
|
|
||||||
notes_name = processed_job_data['name'] + "-notes.txt"
|
|
||||||
with (Path(parent_dir) / notes_name).open("w") as f:
|
|
||||||
f.write(processed_job_data["notes"])
|
|
||||||
return [x.json() for x in created_jobs]
|
|
||||||
|
|
||||||
@classmethod
|
|
||||||
def validate_job_data(cls, new_job_data: dict, upload_directory: Path, uploaded_file=None) -> dict:
|
|
||||||
"""Validates and prepares job data for import.
|
|
||||||
|
|
||||||
This method validates the job data dictionary, handles project file
|
|
||||||
acquisition (upload, download, or local copy), and prepares the job
|
|
||||||
directory structure.
|
|
||||||
|
|
||||||
Args:
|
|
||||||
new_job_data: Dictionary containing job information including
|
|
||||||
'name', 'engine_name', and optionally 'url' or 'local_path'.
|
|
||||||
upload_directory: Base directory for storing uploaded jobs.
|
|
||||||
uploaded_file: Optional uploaded file object from the request.
|
|
||||||
|
|
||||||
Returns:
|
|
||||||
The validated job data dictionary with additional metadata.
|
|
||||||
|
|
||||||
Raises:
|
|
||||||
KeyError: If required fields 'name' or 'engine_name' are missing.
|
|
||||||
FileNotFoundError: If no valid project file can be found.
|
|
-        """
         loaded_project_local_path = None

         # check for required keys
@@ -83,7 +24,7 @@ class JobImportHandler:
         engine_name = new_job_data.get('engine_name')
         if not job_name:
             raise KeyError("Missing job name")
-        if not engine_name:
+        elif not engine_name:
             raise KeyError("Missing engine name")

         project_url = new_job_data.get('url', None)
@@ -103,33 +44,27 @@ class JobImportHandler:

         # Prepare the local filepath
         cleaned_path_name = job_name.replace(' ', '-')
-        folder_name = f"{cleaned_path_name}-{engine_name}-{datetime.now().strftime('%Y.%m.%d_%H.%M.%S')}"
-        job_dir = Path(upload_directory) / folder_name
+        job_dir = os.path.join(upload_directory, '-'.join(
+            [cleaned_path_name, engine_name, datetime.now().strftime("%Y.%m.%d_%H.%M.%S")]))
         os.makedirs(job_dir, exist_ok=True)
-        project_source_dir = Path(job_dir) / 'source'
+        project_source_dir = os.path.join(job_dir, 'source')
         os.makedirs(project_source_dir, exist_ok=True)

         # Move projects to their work directories
         if uploaded_file and uploaded_file.filename:
-            # Handle file uploading
-            filename = secure_filename(uploaded_file.filename)
-            loaded_project_local_path = Path(project_source_dir) / filename
-            uploaded_file.save(str(loaded_project_local_path))
-            logger.info(f"Transfer complete for {loaded_project_local_path.relative_to(upload_directory)}")
+            loaded_project_local_path = os.path.join(project_source_dir, secure_filename(uploaded_file.filename))
+            uploaded_file.save(loaded_project_local_path)
+            logger.info(f"Transfer complete for {loaded_project_local_path.split(upload_directory)[-1]}")

         elif project_url:
-            # Handle downloading project from a URL
-            loaded_project_local_path = Path(project_source_dir) / referred_name
-            shutil.move(str(downloaded_file_url), str(loaded_project_local_path))
-            logger.info(f"Download complete for {loaded_project_local_path.relative_to(upload_directory)}")
+            loaded_project_local_path = os.path.join(project_source_dir, referred_name)
+            shutil.move(downloaded_file_url, loaded_project_local_path)
+            logger.info(f"Download complete for {loaded_project_local_path.split(upload_directory)[-1]}")

         elif local_path:
-            # Handle local files
-            loaded_project_local_path = Path(project_source_dir) / referred_name
-            shutil.copy(str(local_path), str(loaded_project_local_path))
-            logger.info(f"Import complete for {loaded_project_local_path.relative_to(upload_directory)}")
+            loaded_project_local_path = os.path.join(project_source_dir, referred_name)
+            shutil.copy(local_path, loaded_project_local_path)
+            logger.info(f"Import complete for {loaded_project_local_path.split(upload_directory)[-1]}")

-        if loaded_project_local_path.suffix == ".zip":
+        if loaded_project_local_path.lower().endswith('.zip'):
             loaded_project_local_path = cls.process_zipped_project(loaded_project_local_path)

         new_job_data["__loaded_project_local_path"] = loaded_project_local_path
@@ -137,25 +72,13 @@ class JobImportHandler:
         return new_job_data

     @staticmethod
-    def download_project_from_url(project_url: str):
-        """Downloads a project file from the given URL.
-
-        Downloads the file from the specified URL to a temporary directory
-        with progress tracking. Returns the filename and temporary path.
-
-        Args:
-            project_url: The URL to download the project file from.
-
-        Returns:
-            A tuple of (filename, temp_file_path) if successful,
-            (None, None) if download fails.
-        """
+    def download_project_from_url(project_url):
         # This nested function is to handle downloading from a URL
         logger.info(f"Downloading project from url: {project_url}")
         referred_name = os.path.basename(project_url)

         try:
-            response = requests.get(project_url, stream=True, timeout=300)
+            response = requests.get(project_url, stream=True)
             if response.status_code == 200:
                 # Get the total file size from the "Content-Length" header
                 file_size = int(response.headers.get("Content-Length", 0))
@@ -178,29 +101,30 @@ class JobImportHandler:
         return None, None

     @staticmethod
-    def process_zipped_project(zip_path: Path) -> Path:
+    def process_zipped_project(zip_path):
         """
         Processes a zipped project.

-        This method takes a path to a zip file, extracts its contents, and returns the path to the extracted project
-        file. If the zip file contains more than one project file or none, an error is raised.
+        This method takes a path to a zip file, extracts its contents, and returns the path to the extracted project file.
+        If the zip file contains more than one project file or none, an error is raised.

         Args:
-            zip_path (Path): The path to the zip file.
+            zip_path (str): The path to the zip file.

         Raises:
             ValueError: If there's more than 1 project file or none in the zip file.

         Returns:
-            Path: The path to the main project file.
+            str: The path to the main project file.
         """
-        work_path = zip_path.parent
+        work_path = os.path.dirname(zip_path)

         try:
             with zipfile.ZipFile(zip_path, 'r') as myzip:
-                myzip.extractall(str(work_path))
+                myzip.extractall(work_path)

-            project_files = [p for p in work_path.iterdir() if p.is_file() and p.suffix.lower() != ".zip"]
+            project_files = [x for x in os.listdir(work_path) if os.path.isfile(os.path.join(work_path, x))]
+            project_files = [x for x in project_files if '.zip' not in x]

             logger.debug(f"Zip files: {project_files}")

@@ -210,12 +134,12 @@ class JobImportHandler:

             # If there's more than 1 project file or none, raise an error
             if len(project_files) != 1:
-                raise ValueError(f'Cannot find a valid project file in {zip_path.name}')
+                raise ValueError(f'Cannot find a valid project file in {os.path.basename(zip_path)}')

-            extracted_project_path = work_path / project_files[0]
+            extracted_project_path = os.path.join(work_path, project_files[0])
             logger.info(f"Extracted zip file to {extracted_project_path}")

         except (zipfile.BadZipFile, zipfile.LargeZipFile) as e:
             logger.error(f"Error processing zip file: {e}")
-            raise ValueError(f'Error processing zip file: {e}') from e
+            raise ValueError(f"Error processing zip file: {e}")
         return extracted_project_path
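
For reference, both sides of `process_zipped_project` above enforce the same contract: an uploaded archive must expand to exactly one non-zip project file, otherwise the import is rejected. A minimal, standalone sketch of that rule (the function name and the `Path`-based style here are illustrative, not taken from the repository):

```python
import zipfile
from pathlib import Path


def extract_single_project(zip_path: Path) -> Path:
    """Extract a job archive and return its single project file."""
    work_path = zip_path.parent
    with zipfile.ZipFile(zip_path, 'r') as archive:
        archive.extractall(work_path)
    # Keep only regular files that are not the archive itself
    candidates = [p for p in work_path.iterdir()
                  if p.is_file() and p.suffix.lower() != '.zip']
    if len(candidates) != 1:
        raise ValueError(f'Cannot find a valid project file in {zip_path.name}')
    return candidates[0]
```
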
@@ -12,20 +12,12 @@ supported_image_formats = ['.jpg', '.png', '.exr', '.tif', '.tga', '.bmp', '.web


 class PreviewManager:
-    """Manages generation, storage, and retrieval of preview images and videos for rendering jobs."""

     storage_path = None
     _running_jobs = {}

     @classmethod
     def __generate_job_preview_worker(cls, job, replace_existing=False, max_width=480):
-        """Generates image and video previews for a given job.
-
-        Args:
-            job: The job object containing file information.
-            replace_existing (bool): Whether to replace existing previews. Defaults to False.
-            max_width (int): Maximum width for the preview images/videos. Defaults to 480.
-        """

         # Determine best source file to use for thumbs
         job_file_list = job.file_list()
@@ -75,36 +67,20 @@ class PreviewManager:

     @classmethod
     def update_previews_for_job(cls, job, replace_existing=False, wait_until_completion=False, timeout=None):
-        """Updates previews for a given job by starting a background thread.
-
-        Args:
-            job: The job object.
-            replace_existing (bool): Whether to replace existing previews. Defaults to False.
-            wait_until_completion (bool): Whether to wait for the thread to complete. Defaults to False.
-            timeout (float): Timeout for waiting, if applicable.
-        """
         job_thread = cls._running_jobs.get(job.id)
         if job_thread and job_thread.is_alive():
             logger.debug(f'Preview generation job already running for {job}')
-            return
-        job_thread = threading.Thread(target=cls.__generate_job_preview_worker, args=(job, replace_existing,))
-        job_thread.start()
-        cls._running_jobs[job.id] = job_thread
+        else:
+            job_thread = threading.Thread(target=cls.__generate_job_preview_worker, args=(job, replace_existing,))
+            job_thread.start()
+            cls._running_jobs[job.id] = job_thread

         if wait_until_completion:
             job_thread.join(timeout=timeout)

     @classmethod
     def get_previews_for_job(cls, job):
-        """Retrieves previews for a given job.
-
-        Args:
-            job: The job object.
-
-        Returns:
-            dict: A dictionary containing preview information.
-        """
         results = {}
         try:
             directory_path = Path(cls.storage_path)
@@ -127,11 +103,6 @@ class PreviewManager:

     @classmethod
     def delete_previews_for_job(cls, job):
-        """Deletes all previews associated with a given job.
-
-        Args:
-            job: The job object.
-        """
         all_previews = cls.get_previews_for_job(job)
         flattened_list = [item for sublist in all_previews.values() for item in sublist]
         for preview in flattened_list:
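
The `update_previews_for_job` hunk above is, at its core, a per-job thread guard: a preview worker thread is started only if no thread registered for that job id is still alive, and the thread is optionally joined. A stripped-down sketch of the same pattern, with a generic worker function standing in for the real preview generator:

```python
import threading

_running_jobs = {}


def run_once_per_job(job_id, worker, *args):
    """Start worker in a background thread unless one is already running for job_id."""
    existing = _running_jobs.get(job_id)
    if existing and existing.is_alive():
        return existing  # a preview job for this id is already in flight
    thread = threading.Thread(target=worker, args=args)
    thread.start()
    _running_jobs[job_id] = thread
    return thread
```
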
@@ -3,7 +3,6 @@ import logging
 import os
 import threading
 import time
-from pathlib import Path

 import requests
 from requests_toolbelt.multipart import MultipartEncoder, MultipartEncoderMonitor
@@ -185,12 +184,12 @@ class RenderServerProxy:
     # Job Lifecycle:
     # --------------------------------------------

-    def post_job_to_server(self, file_path: Path, job_data, callback=None):
+    def post_job_to_server(self, file_path, job_data, callback=None):
         """
         Posts a job to the server.

         Args:
-            file_path (Path): The path to the file to upload.
+            file_path (str): The path to the file to upload.
             job_data (dict): A dict of jobs data.
             callback (function, optional): A callback function to call during the upload. Defaults to None.

@@ -198,12 +197,12 @@ class RenderServerProxy:
             Response: The response from the server.
         """
         # Check if file exists
-        if not file_path.exists():
+        if not os.path.exists(file_path):
             raise FileNotFoundError(f"File not found: {file_path}")

         # Bypass uploading file if posting to localhost
         if self.is_localhost:
-            job_data['local_path'] = str(file_path)
+            job_data['local_path'] = file_path
             url = urljoin(f'http://{self.hostname}:{self.port}', '/api/add_job')
             headers = {'Content-Type': 'application/json'}
             return requests.post(url, data=json.dumps(job_data), headers=headers)
@@ -211,7 +210,7 @@ class RenderServerProxy:
         # Prepare the form data for remote host
         with open(file_path, 'rb') as file:
             encoder = MultipartEncoder({
-                'file': (file_path.name, file, 'application/octet-stream'),
+                'file': (os.path.basename(file_path), file, 'application/octet-stream'),
                 'json': (None, json.dumps(job_data), 'application/json'),
             })

@@ -248,19 +247,16 @@ class RenderServerProxy:
     # Engines:
     # --------------------------------------------

-    def get_engine_for_filename(self, filename:str, timeout=5):
-        response = self.request(f'engine_for_filename?filename={os.path.basename(filename)}', timeout)
-        return response.text
+    def is_engine_available(self, engine_name):
+        return self.request_data(f'{engine_name}/is_available')

-    def get_installed_engines(self, timeout=5):
-        return self.request_data(f'installed_engines', timeout)
+    def get_all_engines(self):
+        # todo: this doesnt work
+        return self.request_data('all_engines')

-    def is_engine_available(self, engine_name:str, timeout=5):
-        return self.request_data(f'{engine_name}/is_available', timeout)
-
-    def get_all_engine_info(self, response_type='standard', timeout=5):
+    def get_engine_info(self, response_type='standard', timeout=5):
         """
-        Fetches all engine information from the server.
+        Fetches engine information from the server.

         Args:
             response_type (str, optional): Returns standard or full version of engine info
@@ -272,33 +268,19 @@ class RenderServerProxy:
         all_data = self.request_data(f"engine_info?response_type={response_type}", timeout=timeout)
         return all_data

-    def get_engine_info(self, engine_name:str, response_type='standard', timeout=5):
-        """
-        Fetches specific engine information from the server.
-
-        Args:
-            engine_name (str): Name of the engine
-            response_type (str, optional): Returns standard or full version of engine info
-            timeout (int, optional): The number of seconds to wait for a response from the server. Defaults to 5.
-
-        Returns:
-            dict: A dictionary containing the engine information.
-        """
-        return self.request_data(f'{engine_name}/info?response_type={response_type}', timeout)
-
-    def delete_engine(self, engine_name:str, version:str, system_cpu=None):
+    def delete_engine(self, engine, version, system_cpu=None):
         """
         Sends a request to the server to delete a specific engine.

         Args:
-            engine_name (str): The name of the engine to delete.
+            engine (str): The name of the engine to delete.
             version (str): The version of the engine to delete.
             system_cpu (str, optional): The system CPU type. Defaults to None.

         Returns:
             Response: The response from the server.
         """
-        form_data = {'engine': engine_name, 'version': version, 'system_cpu': system_cpu}
+        form_data = {'engine': engine, 'version': version, 'system_cpu': system_cpu}
         return requests.post(f'http://{self.hostname}:{self.port}/api/delete_engine', json=form_data)

     # --------------------------------------------
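
Both versions of `post_job_to_server` above follow the same split: when the proxy points at localhost the job is submitted as plain JSON with a `local_path`, otherwise the project file is streamed as a multipart upload via `requests_toolbelt`. A hedged usage sketch; the hostname, file path, and job values are placeholders (the keys themselves appear in the import-handler hunks earlier):

```python
# Illustrative only -- assumes a reachable Zordon server and a valid .blend file.
proxy = RenderServerProxy('render-node-01.local')
job_data = {'name': 'shot_010', 'engine_name': 'blender'}
response = proxy.post_job_to_server('/projects/shot_010.blend', job_data)
print(response.status_code, response.text)
```
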
@@ -4,7 +4,6 @@ import socket
 import threading
 import time

-from click import Path
 from plyer import notification
 from pubsub import pub

@@ -71,7 +70,7 @@ class DistributedJobManager:
             logger.error(f"Error notifying parent {parent_hostname} about update in subjob {render_job.id}: {e}")

     @classmethod
-    def __local_job_status_changed(cls, job_id: str, old_status: str, new_status: str):
+    def __local_job_status_changed(cls, job_id, old_status, new_status):
         """
         Responds to the 'status_change' pubsub message for local jobs.
         If it's a child job, it notifies the parent job about the status change.
@@ -130,13 +129,13 @@ class DistributedJobManager:
     # --------------------------------------------

     @classmethod
-    def create_render_job(cls, new_job_attributes: dict, loaded_project_local_path: Path):
+    def create_render_job(cls, new_job_attributes, loaded_project_local_path):
         """Creates render jobs. Pass in dict of job_data and the local path to the project. It creates and returns a new
         render job.

         Args:
             new_job_attributes (dict): Dict of desired attributes for new job (frame count, renderer, output path, etc)
-            loaded_project_local_path (Path): The local path to the loaded project.
+            loaded_project_local_path (str): The local path to the loaded project.

         Returns:
             worker: Created job worker
@@ -144,11 +143,15 @@ class DistributedJobManager:

         # get new output path in output_dir
         output_path = new_job_attributes.get('output_path')
-        output_filename = loaded_project_local_path.name if output_path else loaded_project_local_path.stem
+        if not output_path:
+            loaded_project_filename = os.path.basename(loaded_project_local_path)
+            output_filename = os.path.splitext(loaded_project_filename)[0]
+        else:
+            output_filename = os.path.basename(output_path)

         # Prepare output path
-        output_dir = loaded_project_local_path.parent.parent / "output"
-        output_path = output_dir / output_filename
+        output_dir = os.path.join(os.path.dirname(os.path.dirname(loaded_project_local_path)), 'output')
+        output_path = os.path.join(output_dir, output_filename)
         os.makedirs(output_dir, exist_ok=True)
         logger.debug(f"New job output path: {output_path}")

@@ -183,7 +186,7 @@ class DistributedJobManager:
     # --------------------------------------------

     @classmethod
-    def handle_subjob_update_notification(cls, local_job, subjob_data: dict):
+    def handle_subjob_update_notification(cls, local_job, subjob_data):
         """Responds to a notification from a remote subjob and the host requests any subsequent updates from the subjob.

         Args:
@@ -344,7 +347,7 @@ class DistributedJobManager:
             RenderServerProxy(parent_worker.hostname).cancel_job(parent_worker.id, confirm=True)

     @staticmethod
-    def __create_subjob(new_job_attributes: dict, project_path, server_data, server_hostname: str, parent_worker):
+    def __create_subjob(new_job_attributes, project_path, server_data, server_hostname, parent_worker):
         """Convenience method to create subjobs for a parent worker"""
         subjob = new_job_attributes.copy()
         subjob['name'] = f"{parent_worker.name}[{server_data['frame_range'][0]}-{server_data['frame_range'][-1]}]"
@@ -363,7 +366,7 @@ class DistributedJobManager:
     # --------------------------------------------

     @staticmethod
-    def find_available_servers(engine_name: str, system_os=None):
+    def find_available_servers(engine_name, system_os=None):
         """
         Scan the Zeroconf network for currently available render servers supporting a specific engine.

@@ -372,16 +375,16 @@ class DistributedJobManager:
         :return: A list of dictionaries with each dict containing hostname and cpu_count of available servers
         """
         from api.api_server import API_VERSION
-        found_available_servers = []
+        available_servers = []
         for hostname in ZeroconfServer.found_hostnames():
             host_properties = ZeroconfServer.get_hostname_properties(hostname)
             if host_properties.get('api_version') == API_VERSION:
                 if not system_os or (system_os and system_os == host_properties.get('system_os')):
                     response = RenderServerProxy(hostname).is_engine_available(engine_name)
                     if response and response.get('available', False):
-                        found_available_servers.append(response)
+                        available_servers.append(response)

-        return found_available_servers
+        return available_servers


 if __name__ == '__main__':
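
`find_available_servers` filters Zeroconf hosts in three steps: matching `api_version`, optionally matching `system_os`, and finally asking each candidate whether the requested engine is available. A small call sketch; the engine name is real, but the shape of each returned entry beyond the `available` flag shown in the hunk is an assumption:

```python
# Returns one dict per server that advertises a matching API version and
# reports the engine as available; system_os is an optional extra filter.
servers = DistributedJobManager.find_available_servers('blender')
for server_info in servers:
    print(server_info)  # each entry is an is_engine_available() response dict
```
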
@@ -1,8 +1,8 @@
 import json
 import re
-from pathlib import Path

 from src.engines.core.base_engine import *
+from src.utilities.misc_helper import system_safe_path

 logger = logging.getLogger()

@@ -24,12 +24,10 @@ class Blender(BaseRenderEngine):
         from src.engines.blender.blender_worker import BlenderRenderWorker
         return BlenderRenderWorker

-    def ui_options(self):
-        options = [
-            {'name': 'engine', 'options': self.supported_render_engines()},
-            {'name': 'render_device', 'options': ['Any', 'GPU', 'CPU']},
-        ]
-        return options
+    @staticmethod
+    def ui_options(system_info):
+        from src.engines.blender.blender_ui import BlenderUI
+        return BlenderUI.get_options(system_info)

     def supported_extensions(self):
         return ['blend']
@@ -89,7 +87,7 @@ class Blender(BaseRenderEngine):
         scene_info = {}
         try:
             script_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'scripts', 'get_file_info.py')
-            results = self.run_python_script(project_path=project_path, script_path=Path(script_path),
+            results = self.run_python_script(project_path=project_path, script_path=system_safe_path(script_path),
                                              timeout=timeout)
             result_text = results.stdout.decode()
             for line in result_text.splitlines():
@@ -110,7 +108,7 @@ class Blender(BaseRenderEngine):
         try:
             logger.info(f"Starting to pack Blender file: {project_path}")
             script_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'scripts', 'pack_project.py')
-            results = self.run_python_script(project_path=project_path, script_path=Path(script_path),
+            results = self.run_python_script(project_path=project_path, script_path=system_safe_path(script_path),
                                              timeout=timeout)

             result_text = results.stdout.decode()
@@ -182,7 +180,7 @@ class Blender(BaseRenderEngine):
             logger.error("GPU data not found in the output.")

     def supported_render_engines(self):
-        engine_output = subprocess.run([self.engine_path(), '-b', '-E', 'help'], timeout=SUBPROCESS_TIMEOUT,
+        engine_output = subprocess.run([self.engine_path(), '-E', 'help'], timeout=SUBPROCESS_TIMEOUT,
                                        capture_output=True, creationflags=_creationflags).stdout.decode('utf-8').strip()
         render_engines = [x.strip() for x in engine_output.split('Blender Engine Listing:')[-1].strip().splitlines()]
         return render_engines
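
For context on the last hunk: `supported_render_engines` shells out to Blender with `-E help` (the only change is whether `-b`, background mode, is also passed) and keeps everything after the `Blender Engine Listing:` banner. A parsing-only sketch against canned output, so it runs without a Blender install; the sample listing is invented:

```python
sample_output = """Blender Engine Listing:
\tBLENDER_EEVEE
\tBLENDER_WORKBENCH
\tCYCLES"""

render_engines = [x.strip() for x in
                  sample_output.split('Blender Engine Listing:')[-1].strip().splitlines()]
print(render_engines)  # ['BLENDER_EEVEE', 'BLENDER_WORKBENCH', 'CYCLES']
```
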
src/engines/blender/blender_ui.py (new file, 9 lines)
@@ -0,0 +1,9 @@
+
+class BlenderUI:
+    @staticmethod
+    def get_options(system_info):
+        options = [
+            {'name': 'engine', 'options': system_info.get('engines', [])},
+            {'name': 'render_device', 'options': ['Any', 'GPU', 'CPU']},
+        ]
+        return options
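
Taken together with the `ui_options` hunk in the engine file above, this new module moves the UI choices out of the engine class: on one side of the diff `Blender.ui_options` builds the option list inline from a live engine query, on the other it delegates to `BlenderUI.get_options(system_info)` defined here. A small call sketch with an invented `system_info` dict:

```python
from src.engines.blender.blender_ui import BlenderUI

# The 'engines' key is what get_options reads; these values are placeholders.
system_info = {'engines': ['CYCLES', 'BLENDER_EEVEE', 'BLENDER_WORKBENCH']}
for option in BlenderUI.get_options(system_info):
    print(option['name'], option['options'])
```
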
@@ -1,10 +1,6 @@
 import bpy
 import json

-# Force CPU rendering
-bpy.context.preferences.addons["cycles"].preferences.compute_device_type = "NONE"
-bpy.context.scene.cycles.device = "CPU"
-
 # Ensure Cycles is available
 bpy.context.preferences.addons['cycles'].preferences.get_devices()

@@ -2,10 +2,9 @@ import logging
 import os
 import platform
 import subprocess
-from typing import Optional, List, Dict, Any, Type

 logger = logging.getLogger()
-SUBPROCESS_TIMEOUT = 10
+SUBPROCESS_TIMEOUT = 5


 class BaseRenderEngine(object):
@@ -18,23 +17,14 @@ class BaseRenderEngine(object):
     executable on different operating systems or environments.
     """

-    install_paths: List[str] = []
-    binary_names: Dict[str, str] = {}
+    install_paths = []

     # --------------------------------------------
     # Required Overrides for Subclasses:
     # --------------------------------------------

-    def __init__(self, custom_path: Optional[str] = None) -> None:
-        """Initialize the render engine.
-
-        Args:
-            custom_path: Optional custom path to the engine executable.
-
-        Raises:
-            FileNotFoundError: If the engine executable cannot be found.
-        """
-        self.custom_engine_path: Optional[str] = custom_path
+    def __init__(self, custom_path=None):
+        self.custom_engine_path = custom_path

         if not self.engine_path() or not os.path.exists(self.engine_path()):
             raise FileNotFoundError(f"Cannot find path to engine for {self.name()} instance: {self.engine_path()}")
@@ -42,7 +32,7 @@ class BaseRenderEngine(object):
             logger.warning(f"Path is not executable. Setting permissions to 755 for {self.engine_path()}")
             os.chmod(self.engine_path(), 0o755)

-    def version(self) -> str:
+    def version(self):
         """Return the version number as a string.

         Returns:
@@ -53,7 +43,7 @@ class BaseRenderEngine(object):
         """
         raise NotImplementedError(f"version not implemented for {self.__class__.__name__}")

-    def get_project_info(self, project_path: str, timeout: int = 10) -> Dict[str, Any]:
+    def get_project_info(self, project_path, timeout=10):
         """Extracts detailed project information from the given project path.

         Args:
@@ -69,7 +59,7 @@ class BaseRenderEngine(object):
         raise NotImplementedError(f"get_project_info not implemented for {self.__class__.__name__}")

     @classmethod
-    def get_output_formats(cls) -> List[str]:
+    def get_output_formats(cls):
         """Returns a list of available output formats supported by the engine.

         Returns:
@@ -78,22 +68,21 @@ class BaseRenderEngine(object):
         raise NotImplementedError(f"get_output_formats not implemented for {cls.__name__}")

     @staticmethod
-    def worker_class() -> Type[Any]: # override when subclassing to link worker class
+    def worker_class(): # override when subclassing to link worker class
         raise NotImplementedError("Worker class not implemented")

     # --------------------------------------------
     # Optional Overrides for Subclasses:
     # --------------------------------------------

-    def supported_extensions(self) -> List[str]:
-        """Return a list of file extensions supported by this engine.
-
+    def supported_extensions(self):
+        """
         Returns:
-            List[str]: List of supported file extensions (e.g., ['.blend', '.mp4']).
+            list[str]: list of supported extensions
         """
         return []

-    def get_help(self) -> str:
+    def get_help(self):
         """Retrieves the help documentation for the engine.

         This method runs the engine's help command (default: '-h') and captures the output.
@@ -113,7 +102,7 @@ class BaseRenderEngine(object):
                                        timeout=SUBPROCESS_TIMEOUT, creationflags=creationflags).decode('utf-8')
         return help_doc

-    def system_info(self) -> Dict[str, Any]:
+    def system_info(self):
         """Return additional information about the system specfic to the engine (configured GPUs, render engines, etc)

         Returns:
@@ -121,7 +110,7 @@ class BaseRenderEngine(object):
         """
         return {}

-    def perform_presubmission_tasks(self, project_path: str) -> str:
+    def perform_presubmission_tasks(self, project_path):
         """Perform any pre-submission tasks on a project file before uploading it to a server (pack textures, etc.)

         Override this method to:
@@ -137,60 +126,31 @@ class BaseRenderEngine(object):
         """
         return project_path

-    def get_arguments(self) -> Dict[str, Any]:
-        """Return command-line arguments for this engine.
-
-        Returns:
-            Dict[str, Any]: Dictionary of argument specifications.
-        """
-        return {}
+    def get_arguments(self):
+        pass

     @staticmethod
-    def downloader() -> Optional[Any]:
-        """Return the downloader class for this engine.
-
-        Returns:
-            Optional[Any]: Downloader class instance, or None if no downloader is used.
-        """
+    def downloader(): # override when subclassing if using a downloader class
         return None

-    def ui_options(self) -> Dict[str, Any]:
-        """Return UI configuration options for this engine.
-
-        Returns:
-            Dict[str, Any]: Dictionary of UI options and their configurations.
-        """
+    @staticmethod
+    def ui_options(system_info): # override to return options for ui
         return {}

     # --------------------------------------------
     # Do Not Override These Methods:
     # --------------------------------------------

-    def engine_path(self) -> Optional[str]:
-        """Return the path to the engine executable.
-
-        Returns:
-            Optional[str]: Path to the engine executable, or None if not found.
-        """
+    def engine_path(self):
         return self.custom_engine_path or self.default_engine_path()

     @classmethod
-    def name(cls) -> str:
-        """Return the name of this engine.
-
-        Returns:
-            str: Engine name in lowercase.
-        """
+    def name(cls):
         return str(cls.__name__).lower()

     @classmethod
-    def default_engine_path(cls) -> Optional[str]:
-        """Find the default path to the engine executable.
-
-        Returns:
-            Optional[str]: Default path to the engine executable, or None if not found.
-        """
-        path: Optional[str] = None
+    def default_engine_path(cls):
+        path = None

         try: # Linux and macOS
             path = subprocess.check_output(['which', cls.name()], timeout=SUBPROCESS_TIMEOUT).decode('utf-8').strip()
         except (subprocess.CalledProcessError, FileNotFoundError):
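
Read together, the `BaseRenderEngine` hunks describe the subclassing contract: `version`, `get_project_info`, `get_output_formats`, and `worker_class` are the required overrides, the rest are optional, and executable discovery falls back to `which` on Linux and macOS. A minimal hypothetical subclass to show the shape only; it is not an engine that ships with Zordon, and it assumes the same `from src.engines.core.base_engine import *` wildcard import that the Blender engine uses:

```python
class ExampleEngine(BaseRenderEngine):
    """Hypothetical engine used only to illustrate the override points."""

    def version(self):
        # Most engines print their version on request; '-version' is a placeholder flag.
        out = subprocess.check_output([self.engine_path(), '-version'], timeout=SUBPROCESS_TIMEOUT)
        return out.decode('utf-8').splitlines()[0]

    def supported_extensions(self):
        return ['example']

    @staticmethod
    def worker_class():
        # A matching BaseRenderWorker subclass would be returned here.
        raise NotImplementedError
```
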
@@ -8,7 +8,6 @@ import subprocess
 import threading
 import time
 from datetime import datetime
-from typing import Optional, Dict, Any, List, Union

 import psutil
 from pubsub import pub
@@ -50,25 +49,9 @@ class BaseRenderWorker(Base):
     # Required Overrides for Subclasses:
     # --------------------------------------------

-    def __init__(self, input_path: str, output_path: str, engine_path: str, priority: int = 2,
-                 args: Optional[Dict[str, Any]] = None, ignore_extensions: bool = True,
-                 parent: Optional[str] = None, name: Optional[str] = None) -> None:
-        """Initialize a render worker.
-
-        Args:
-            input_path: Path to the input project file.
-            output_path: Path where output files will be saved.
-            engine_path: Path to the render engine executable.
-            priority: Job priority level (default: 2).
-            args: Additional arguments for the render job.
-            ignore_extensions: Whether to ignore file extension validation.
-            parent: Parent job ID for distributed jobs.
-            name: Custom name for the job.
-
-        Raises:
-            ValueError: If file extension is not supported.
-            NotImplementedError: If engine is not defined.
-        """
+    def __init__(self, input_path, output_path, engine_path, priority=2, args=None, ignore_extensions=True, parent=None,
+                 name=None):
         if not ignore_extensions:
             if not any(ext in input_path for ext in self.engine.supported_extensions()):
                 err_meg = f"Cannot find valid project with supported file extension for '{self.engine.name()}'"
@@ -78,7 +61,6 @@ class BaseRenderWorker(Base):
             raise NotImplementedError(f"Engine not defined for {self.__class__.__name__}")

         def generate_id():
-            """Generate a unique job ID."""
             import uuid
             return str(uuid.uuid4()).split('-')[0]

@@ -121,15 +103,15 @@ class BaseRenderWorker(Base):
         self.__last_output_time = None
         self.watchdog_timeout = 120

-    def generate_worker_subprocess(self) -> List[str]:
+    def generate_worker_subprocess(self):
         """Generate a return a list of the command line arguments necessary to perform requested job

         Returns:
             list[str]: list of command line arguments
         """
         raise NotImplementedError("generate_worker_subprocess not implemented")

-    def _parse_stdout(self, line: str) -> None:
+    def _parse_stdout(self, line):
         """Parses a line of standard output from the engine.

         This method should be overridden in a subclass to implement the logic for processing
@@ -151,18 +133,13 @@ class BaseRenderWorker(Base):
     # Optional Overrides for Subclasses:
     # --------------------------------------------

-    def percent_complete(self) -> float:
-        """Return the completion percentage of this job.
-
-        Returns:
-            float: Completion percentage between 0.0 and 1.0.
-        """
+    def percent_complete(self):
         # todo: fix this
         if self.status == RenderStatus.COMPLETED:
             return 1.0
         return 0

-    def post_processing(self) -> None:
+    def post_processing(self):
         """Override to perform any engine-specific postprocessing"""
         pass

@@ -171,11 +148,6 @@ class BaseRenderWorker(Base):
     # --------------------------------------------

     def __repr__(self):
-        """Return string representation of the worker.
-
-        Returns:
-            str: String representation showing job details.
-        """
         return f"<Job id:{self.id} p{self.priority} {self.engine_name}-{self.engine_version} '{self.name}' status:{self.status.value}>"

     @property
@@ -204,14 +176,6 @@ class BaseRenderWorker(Base):
             pub.sendMessage('frame_complete', job_id=self.id, frame_number=self.current_frame)

     def generate_subprocess(self):
-        """Generate subprocess command arguments.
-
-        Returns:
-            List[str]: List of command line arguments for the subprocess.
-
-        Raises:
-            ValueError: If argument conflicts are detected.
-        """
         # Convert raw args from string if available and catch conflicts
         generated_args = [str(x) for x in self.generate_worker_subprocess()]
         generated_args_flags = [x for x in generated_args if x.startswith('-')]
@@ -222,11 +186,6 @@ class BaseRenderWorker(Base):
         return generated_args

     def get_raw_args(self):
-        """Parse raw command line arguments from args dictionary.
-
-        Returns:
-            Optional[List[str]]: Parsed raw arguments, or None if no raw args.
-        """
         raw_args_string = self.args.get('raw', '')
         raw_args = None
         if raw_args_string:
@@ -235,20 +194,12 @@ class BaseRenderWorker(Base):
         return raw_args

     def log_path(self):
-        """Generate the log file path for this job.
-
-        Returns:
-            str: Path to the log file.
-        """
         filename = (self.name or os.path.basename(self.input_path)) + '_' + \
                    self.date_created.strftime("%Y.%m.%d_%H.%M.%S") + '.log'
         return os.path.join(os.path.dirname(self.input_path), filename)

     def start(self):
-        """Start the render job.
-
-        Validates input paths and engine availability, then starts the render thread.
-        """
         if self.status not in [RenderStatus.SCHEDULED, RenderStatus.NOT_STARTED, RenderStatus.CONFIGURING]:
             logger.error(f"Trying to start job with status: {self.status}")
             return
@@ -478,33 +429,17 @@ class BaseRenderWorker(Base):
             logger.error(f"Error stopping the process: {e}")

     def is_running(self):
-        """Check if the render job is currently running.
-
-        Returns:
-            bool: True if the job is running, False otherwise.
-        """
         if hasattr(self, '__thread'):
             return self.__thread.is_alive()
         return False

     def log_error(self, error_line, halt_render=False):
-        """Log an error and optionally halt the render.
-
-        Args:
-            error_line: Error message to log.
-            halt_render: Whether to stop the render due to this error.
-        """
         logger.error(error_line)
         self.errors.append(error_line)
         if halt_render:
             self.stop(is_error=True)

     def stop(self, is_error=False):
-        """Stop the render job.
-
-        Args:
-            is_error: Whether this stop is due to an error.
-        """
         logger.debug(f"Stopping {self}")

         # cleanup status
@@ -522,19 +457,9 @@ class BaseRenderWorker(Base):
             self.__thread.join(timeout=5)

     def time_elapsed(self):
-        """Get elapsed time for this job.
-
-        Returns:
-            str: Formatted time elapsed string.
-        """
         return get_time_elapsed(self.start_time, self.end_time)

     def file_list(self):
-        """Get list of output files for this job.
-
-        Returns:
-            List[str]: List of output file paths.
-        """
         try:
             job_dir = os.path.dirname(self.output_path)
             file_list = [
@@ -548,11 +473,6 @@ class BaseRenderWorker(Base):
             return []

     def json(self):
-        """Convert worker to JSON-serializable dictionary.
-
-        Returns:
-            Dict[str, Any]: Dictionary representation of worker data.
-        """
         job_dict = {
             'id': self.id,
             'name': self.name,
@@ -582,10 +502,8 @@ class BaseRenderWorker(Base):

         # convert to json and back to auto-convert dates to iso format
         def date_serializer(o):
-            """Serialize datetime objects to ISO format."""
             if isinstance(o, datetime):
                 return o.isoformat()
-            return None

         json_convert = json.dumps(job_dict, default=date_serializer)
         worker_json = json.loads(json_convert)
@@ -593,15 +511,6 @@ class BaseRenderWorker(Base):


 def timecode_to_frames(timecode, frame_rate):
-    """Convert timecode string to frame number.
-
-    Args:
-        timecode: Timecode in format HH:MM:SS:FF.
-        frame_rate: Frames per second.
-
-    Returns:
-        int: Frame number corresponding to timecode.
-    """
     e = [int(x) for x in timecode.split(':')]
     seconds = (((e[0] * 60) + e[1] * 60) + e[2])
     frames = (seconds * frame_rate) + e[-1] + 1
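
The worker hunks follow the same pattern as the engine ones: `generate_worker_subprocess` must return the engine's command line and `_parse_stdout` consumes its output, while `percent_complete` and `post_processing` are optional. A minimal hypothetical subclass sketch; the flags and the log text it matches are invented, and it assumes the constructor stores `engine_path`, `input_path`, and `output_path` as attributes of the same names:

```python
class ExampleRenderWorker(BaseRenderWorker):
    """Hypothetical worker used only to illustrate the required overrides."""

    def generate_worker_subprocess(self):
        # Placeholder flags; a real engine defines its own argument format.
        return [self.engine_path, '--input', self.input_path, '--output', self.output_path]

    def _parse_stdout(self, line):
        # Invented log format; a real worker parses the engine's actual output.
        if line.startswith('Finished frame '):
            self.current_frame = int(line.split()[-1])
```
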
@@ -3,101 +3,58 @@ import os
|
|||||||
import shutil
|
import shutil
|
||||||
import threading
|
import threading
|
||||||
import concurrent.futures
|
import concurrent.futures
|
||||||
from pathlib import Path
|
|
||||||
from typing import Type, List, Dict, Any, Optional
|
|
||||||
|
|
||||||
from src.engines.core.base_engine import BaseRenderEngine
|
|
||||||
from src.engines.blender.blender_engine import Blender
|
from src.engines.blender.blender_engine import Blender
|
||||||
from src.engines.ffmpeg.ffmpeg_engine import FFMPEG
|
from src.engines.ffmpeg.ffmpeg_engine import FFMPEG
|
||||||
from src.utilities.misc_helper import current_system_os, current_system_cpu
|
from src.utilities.misc_helper import system_safe_path, current_system_os, current_system_cpu
|
||||||
|
|
||||||
logger = logging.getLogger()
|
logger = logging.getLogger()
|
||||||
|
|
||||||
|
|
||||||
ENGINE_CLASSES = [Blender, FFMPEG]
|
ENGINE_CLASSES = [Blender, FFMPEG]
|
||||||
|
|
||||||
|
|
||||||
class EngineManager:
|
class EngineManager:
|
||||||
"""Class that manages different versions of installed render engines and handles fetching and downloading new versions,
|
"""Class that manages different versions of installed render engines and handles fetching and downloading new versions,
|
||||||
if possible.
|
if possible.
|
||||||
"""
|
"""
|
||||||
|
|
||||||
engines_path: Optional[str] = None
|
engines_path = None
|
||||||
download_tasks: List[Any] = []
|
download_tasks = []
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
def supported_engines() -> list[type[BaseRenderEngine]]:
|
def supported_engines():
|
||||||
"""Return list of supported engine classes.
|
|
||||||
|
|
||||||
Returns:
|
|
||||||
List[Type[BaseRenderEngine]]: List of available engine classes.
|
|
||||||
"""
|
|
||||||
return ENGINE_CLASSES
|
return ENGINE_CLASSES
|
||||||
|
|
||||||
# --- Installed Engines ---
|
# --- Installed Engines ---
|
||||||
|
|
||||||
@classmethod
|
@classmethod
|
||||||
def engine_class_for_project_path(cls, path: str) -> Type[BaseRenderEngine]:
|
def engine_for_project_path(cls, path):
|
||||||
"""Find engine class that can handle the given project file.
|
|
||||||
|
|
||||||
Args:
|
|
||||||
path: Path to project file.
|
|
||||||
|
|
||||||
Returns:
|
|
||||||
Type[BaseRenderEngine]: Engine class that can handle the file.
|
|
||||||
"""
|
|
||||||
_, extension = os.path.splitext(path)
|
_, extension = os.path.splitext(path)
|
||||||
extension = extension.lower().strip('.')
|
extension = extension.lower().strip('.')
|
||||||
for engine_class in cls.supported_engines():
|
for engine_class in cls.supported_engines():
|
||||||
engine = cls.get_latest_engine_instance(engine_class)
|
engine = cls.get_latest_engine_instance(engine_class)
|
||||||
if extension in engine.supported_extensions():
|
if extension in engine.supported_extensions():
|
||||||
return engine_class
|
return engine
|
||||||
undefined_renderer_support = [x for x in cls.supported_engines() if not cls.get_latest_engine_instance(x).supported_extensions()]
|
undefined_renderer_support = [x for x in cls.supported_engines() if not cls.get_latest_engine_instance(x).supported_extensions()]
|
||||||
return undefined_renderer_support[0]
|
return undefined_renderer_support[0]
|
||||||
|
|
||||||
@classmethod
|
@classmethod
|
||||||
def engine_class_with_name(cls, engine_name: str) -> Optional[Type[BaseRenderEngine]]:
|
def engine_with_name(cls, engine_name):
|
||||||
"""Find engine class by name.
|
|
||||||
|
|
||||||
Args:
|
|
||||||
engine_name: Name of engine to find.
|
|
||||||
|
|
||||||
Returns:
|
|
||||||
Optional[Type[BaseRenderEngine]]: Engine class if found, None otherwise.
|
|
||||||
"""
|
|
||||||
for obj in cls.supported_engines():
|
for obj in cls.supported_engines():
|
||||||
if obj.name().lower() == engine_name.lower():
|
if obj.name().lower() == engine_name.lower():
|
||||||
return obj
|
return obj
|
||||||
return None
|
return None
|
||||||
|
|
||||||
@classmethod
|
@classmethod
|
||||||
def get_latest_engine_instance(cls, engine_class: Type[BaseRenderEngine]) -> BaseRenderEngine:
|
def get_latest_engine_instance(cls, engine_class):
|
||||||
"""Create instance of latest installed engine version.
|
|
||||||
|
|
||||||
Args:
|
|
||||||
engine_class: Engine class to instantiate.
|
|
||||||
|
|
||||||
Returns:
|
|
||||||
BaseRenderEngine: Instance of engine with latest version.
|
|
||||||
"""
|
|
||||||
newest = cls.newest_installed_engine_data(engine_class.name())
|
newest = cls.newest_installed_engine_data(engine_class.name())
|
||||||
engine = engine_class(newest["path"])
|
engine = engine_class(newest["path"])
|
||||||
return engine
|
return engine
|
||||||
|
|
||||||
@classmethod
|
@classmethod
|
||||||
def get_installed_engine_data(cls, filter_name: Optional[str] = None, include_corrupt: bool = False,
|
def get_installed_engine_data(cls, filter_name=None, include_corrupt=False, ignore_system=False):
|
||||||
ignore_system: bool = False) -> List[Dict[str, Any]]:
|
|
||||||
"""Get data about installed render engines.
|
|
||||||
|
|
||||||
Args:
|
|
||||||
filter_name: Optional engine name to filter by.
|
|
||||||
include_corrupt: Whether to include potentially corrupted installations.
|
|
||||||
ignore_system: Whether to ignore system-installed engines.
|
|
||||||
|
|
||||||
Returns:
|
|
||||||
List[Dict[str, Any]]: List of installed engine data.
|
|
||||||
|
|
||||||
Raises:
|
|
||||||
FileNotFoundError: If engines path is not set.
|
|
||||||
"""
|
|
||||||
if not cls.engines_path:
|
if not cls.engines_path:
|
||||||
raise FileNotFoundError("Engine path is not set")
|
raise FileNotFoundError("Engine path is not set")
|
||||||
|
|
||||||
@@ -118,13 +75,15 @@ class EngineManager:
|
|||||||
# Initialize binary_name with engine name
|
# Initialize binary_name with engine name
|
||||||
binary_name = result_dict['engine'].lower()
|
binary_name = result_dict['engine'].lower()
|
||||||
# Determine the correct binary name based on the engine and system_os
|
# Determine the correct binary name based on the engine and system_os
|
||||||
eng = cls.engine_class_with_name(result_dict['engine'])
|
eng = cls.engine_with_name(result_dict['engine'])
|
||||||
binary_name = eng.binary_names.get(result_dict['system_os'], binary_name)
|
binary_name = eng.binary_names.get(result_dict['system_os'], binary_name)
|
||||||
|
|
||||||
# Find the path to the binary file
|
# Find the path to the binary file
|
||||||
search_root = Path(cls.engines_path) / directory
|
path = next(
|
||||||
match = next((p for p in search_root.rglob(binary_name) if p.is_file()), None)
|
(os.path.join(root, binary_name) for root, _, files in
|
||||||
path = str(match) if match else None
|
os.walk(system_safe_path(os.path.join(cls.engines_path, directory))) if binary_name in files),
|
||||||
|
None
|
||||||
|
)
|
||||||
result_dict['path'] = path
|
result_dict['path'] = path
|
||||||
|
|
||||||
# fetch version number from binary - helps detect corrupted downloads - disabled due to perf issues
|
# fetch version number from binary - helps detect corrupted downloads - disabled due to perf issues
|
||||||
@@ -174,8 +133,7 @@ class EngineManager:
    # --- Check for Updates ---

    @classmethod
-    def update_all_engines(cls) -> None:
-        """Check for and download updates for all downloadable engines."""
+    def update_all_engines(cls):
        for engine in cls.downloadable_engines():
            update_available = cls.is_engine_update_available(engine)
            if update_available:
@@ -183,34 +141,13 @@ class EngineManager:
                cls.download_engine(engine.name(), update_available['version'], background=True)

    @classmethod
-    def all_version_data_for_engine(cls, engine_name:str, include_corrupt=False, ignore_system=False) -> list:
-        """Get all version data for a specific engine.
-
-        Args:
-            engine_name: Name of engine to query.
-            include_corrupt: Whether to include corrupt installations.
-            ignore_system: Whether to ignore system installations.
-
-        Returns:
-            list: Sorted list of engine version data (newest first).
-        """
+    def all_version_data_for_engine(cls, engine_name, include_corrupt=False, ignore_system=False):
        versions = cls.get_installed_engine_data(filter_name=engine_name, include_corrupt=include_corrupt, ignore_system=ignore_system)
        sorted_versions = sorted(versions, key=lambda x: x['version'], reverse=True)
        return sorted_versions

    @classmethod
-    def newest_installed_engine_data(cls, engine_name:str, system_os=None, cpu=None, ignore_system=None) -> list:
-        """Get newest installed engine data for specific platform.
-
-        Args:
-            engine_name: Name of engine to query.
-            system_os: Operating system to filter by (defaults to current).
-            cpu: CPU architecture to filter by (defaults to current).
-            ignore_system: Whether to ignore system installations.
-
-        Returns:
-            list: Newest engine data or empty list if not found.
-        """
+    def newest_installed_engine_data(cls, engine_name, system_os=None, cpu=None, ignore_system=None):
        system_os = system_os or current_system_os()
        cpu = cpu or current_system_cpu()

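`all_version_data_for_engine` sorts the installed-version records by their `'version'` string, newest first. A standalone sketch with made-up records; the tuple-key variant at the end only illustrates how a numeric sort would treat versions like `'4.10'` vs `'4.2'` differently, it is not something this diff does:

```python
# Made-up stand-ins for the entries returned by get_installed_engine_data().
versions = [
    {"engine": "blender", "version": "3.6.2", "path": "/engines/blender-3.6.2"},
    {"engine": "blender", "version": "4.1.0", "path": "/engines/blender-4.1.0"},
    {"engine": "blender", "version": "3.3.1", "path": "/engines/blender-3.3.1"},
]

# Same idea as all_version_data_for_engine(): newest first by string comparison.
newest_first = sorted(versions, key=lambda x: x["version"], reverse=True)

# Illustrative alternative: a tuple key for purely dotted-integer versions.
newest_first_numeric = sorted(
    versions,
    key=lambda x: tuple(int(part) for part in x["version"].split(".")),
    reverse=True,
)

print(newest_first[0]["version"])          # 4.1.0
print(newest_first_numeric[0]["version"])  # 4.1.0
```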
@@ -220,49 +157,37 @@ class EngineManager:
            return filtered[0]
        except IndexError:
            logger.error(f"Cannot find newest engine version for {engine_name}-{system_os}-{cpu}")
-            return []
+            return None

    @classmethod
-    def is_version_installed(cls, engine_name:str, version:str, system_os=None, cpu=None, ignore_system=False):
-        """Check if specific engine version is installed.
-
-        Args:
-            engine_name: Name of engine to check.
-            version: Version string to check.
-            system_os: Operating system to check (defaults to current).
-            cpu: CPU architecture to check (defaults to current).
-            ignore_system: Whether to ignore system installations.
-
-        Returns:
-            Engine data if found, False otherwise.
-        """
+    def is_version_installed(cls, engine, version, system_os=None, cpu=None, ignore_system=False):
        system_os = system_os or current_system_os()
        cpu = cpu or current_system_cpu()

-        filtered = [x for x in cls.get_installed_engine_data(filter_name=engine_name, ignore_system=ignore_system) if
+        filtered = [x for x in cls.get_installed_engine_data(filter_name=engine, ignore_system=ignore_system) if
                    x['system_os'] == system_os and x['cpu'] == cpu and x['version'] == version]
        return filtered[0] if filtered else False

    @classmethod
-    def version_is_available_to_download(cls, engine_name:str, version, system_os=None, cpu=None):
+    def version_is_available_to_download(cls, engine, version, system_os=None, cpu=None):
        try:
-            downloader = cls.engine_class_with_name(engine_name).downloader()
+            downloader = cls.engine_with_name(engine).downloader()
            return downloader.version_is_available_to_download(version=version, system_os=system_os, cpu=cpu)
        except Exception as e:
            logger.debug(f"Exception in version_is_available_to_download: {e}")
            return None

    @classmethod
-    def find_most_recent_version(cls, engine_name:str, system_os=None, cpu=None, lts_only=False) -> dict:
+    def find_most_recent_version(cls, engine=None, system_os=None, cpu=None, lts_only=False):
        try:
-            downloader = cls.engine_class_with_name(engine_name).downloader()
+            downloader = cls.engine_with_name(engine).downloader()
            return downloader.find_most_recent_version(system_os=system_os, cpu=cpu)
        except Exception as e:
            logger.debug(f"Exception in find_most_recent_version: {e}")
-            return {}
+            return None

    @classmethod
-    def is_engine_update_available(cls, engine_class: Type[BaseRenderEngine], ignore_system_installs=False):
+    def is_engine_update_available(cls, engine_class, ignore_system_installs=False):
        logger.debug(f"Checking for updates to {engine_class.name()}")
        latest_version = engine_class.downloader().find_most_recent_version()

@@ -281,31 +206,26 @@ class EngineManager:

    @classmethod
    def downloadable_engines(cls):
-        """Get list of engines that support downloading.
-
-        Returns:
-            List[Type[BaseRenderEngine]]: Engines with downloader capability.
-        """
        return [engine for engine in cls.supported_engines() if hasattr(engine, "downloader") and engine.downloader()]

    @classmethod
-    def get_existing_download_task(cls, engine_name, version, system_os=None, cpu=None):
+    def get_existing_download_task(cls, engine, version, system_os=None, cpu=None):
        for task in cls.download_tasks:
            task_parts = task.name.split('-')
            task_engine, task_version, task_system_os, task_cpu = task_parts[:4]

-            if engine_name == task_engine and version == task_version:
+            if engine == task_engine and version == task_version:
                if system_os in (task_system_os, None) and cpu in (task_cpu, None):
                    return task
        return None

    @classmethod
-    def download_engine(cls, engine_name, version, system_os=None, cpu=None, background=False, ignore_system=False):
+    def download_engine(cls, engine, version, system_os=None, cpu=None, background=False, ignore_system=False):

-        engine_to_download = cls.engine_class_with_name(engine_name)
-        existing_task = cls.get_existing_download_task(engine_name, version, system_os, cpu)
+        engine_to_download = cls.engine_with_name(engine)
+        existing_task = cls.get_existing_download_task(engine, version, system_os, cpu)
        if existing_task:
-            logger.debug(f"Already downloading {engine_name} {version}")
+            logger.debug(f"Already downloading {engine} {version}")
            if not background:
                existing_task.join()  # If download task exists, wait until it's done downloading
            return None
@@ -315,7 +235,7 @@ class EngineManager:
        elif not cls.engines_path:
            raise FileNotFoundError("Engines path must be set before requesting downloads")

-        thread = EngineDownloadWorker(engine_name, version, system_os, cpu)
+        thread = EngineDownloadWorker(engine, version, system_os, cpu)
        cls.download_tasks.append(thread)
        thread.start()

@@ -323,72 +243,41 @@ class EngineManager:
            return thread

        thread.join()
-        found_engine = cls.is_version_installed(engine_name, version, system_os, cpu, ignore_system)  # Check that engine downloaded
+        found_engine = cls.is_version_installed(engine, version, system_os, cpu, ignore_system)  # Check that engine downloaded
        if not found_engine:
-            logger.error(f"Error downloading {engine_name}")
+            logger.error(f"Error downloading {engine}")
        return found_engine

    @classmethod
-    def delete_engine_download(cls, engine_name, version, system_os=None, cpu=None):
-        logger.info(f"Requested deletion of engine: {engine_name}-{version}")
+    def delete_engine_download(cls, engine, version, system_os=None, cpu=None):
+        logger.info(f"Requested deletion of engine: {engine}-{version}")

-        found = cls.is_version_installed(engine_name, version, system_os, cpu)
+        found = cls.is_version_installed(engine, version, system_os, cpu)
        if found and found['type'] == 'managed':  # don't delete system installs
            # find the root directory of the engine executable
-            root_dir_name = '-'.join([engine_name, version, found['system_os'], found['cpu']])
+            root_dir_name = '-'.join([engine, version, found['system_os'], found['cpu']])
            remove_path = os.path.join(found['path'].split(root_dir_name)[0], root_dir_name)
            # delete the file path
            logger.info(f"Deleting engine at path: {remove_path}")
            shutil.rmtree(remove_path, ignore_errors=False)
-            logger.info(f"Engine {engine_name}-{version}-{found['system_os']}-{found['cpu']} successfully deleted")
+            logger.info(f"Engine {engine}-{version}-{found['system_os']}-{found['cpu']} successfully deleted")
            return True
        elif found:  # these are managed by the system / user. Don't delete these.
-            logger.error(f'Cannot delete requested {engine_name} {version}. Managed externally.')
+            logger.error(f'Cannot delete requested {engine} {version}. Managed externally.')
        else:
-            logger.error(f"Cannot find engine: {engine_name}-{version}")
+            logger.error(f"Cannot find engine: {engine}-{version}")
        return False

    # --- Background Tasks ---

    @classmethod
    def active_downloads(cls) -> list:
-        """Get list of currently active download tasks.
-
-        Returns:
-            list: List of active EngineDownloadWorker threads.
-        """
        return [x for x in cls.download_tasks if x.is_alive()]

    @classmethod
-    def create_worker(cls, engine_name: str, input_path: Path, output_path: Path, engine_version=None, args=None, parent=None, name=None):
-        """
-        Create and return a worker instance for a specific engine.
-
-        This resolves the appropriate engine binary/path for the requested engine and version,
-        downloading the engine if necessary (when a specific version is requested and not found
-        locally). The returned worker is constructed with string paths for compatibility with
-        worker implementations that expect `str` rather than `Path`.
-
-        Args:
-            engine_name: The engine name used to resolve an engine class and its worker.
-            input_path: Path to the input file/folder for the worker to process.
-            output_path: Path where the worker should write output.
-            engine_version: Optional engine version to use. If `None` or `'latest'`, the newest
-                installed version is used. If a specific version is provided and not installed,
-                the engine will be downloaded.
-            args: Optional arguments passed through to the worker (engine-specific).
-            parent: Optional Qt/GUI parent object passed through to the worker constructor.
-            name: Optional name/label passed through to the worker constructor.
-
-        Returns:
-            An instance of the engine-specific worker class.
-
-        Raises:
-            FileNotFoundError: If no versions of the engine are installed, if the requested
-                version cannot be found or downloaded, or if the engine path cannot be resolved.
-        """
-
-        worker_class = cls.engine_class_with_name(engine_name).worker_class()
+    def create_worker(cls, engine_name, input_path, output_path, engine_version=None, args=None, parent=None, name=None):
+        worker_class = cls.engine_with_name(engine_name).worker_class()

        # check to make sure we have versions installed
        all_versions = cls.all_version_data_for_engine(engine_name)
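The removed docstring spells out `create_worker`'s resolution rule: use the newest installed build when the version is `None` or `'latest'`, download a pinned version that is missing locally, and raise `FileNotFoundError` when nothing can be resolved. A self-contained sketch of just that rule, with made-up data and a stub standing in for the real download step:

```python
# Made-up installed-engine records; the real data comes from
# all_version_data_for_engine() and includes more fields.
installed = [
    {"version": "4.1.0", "path": "/engines/blender-4.1.0"},
    {"version": "3.6.2", "path": "/engines/blender-3.6.2"},
]


def download(version):
    # Placeholder for the download step; pretend it installs the build.
    entry = {"version": version, "path": f"/engines/blender-{version}"}
    installed.append(entry)
    return entry


def resolve_engine_path(engine_version=None):
    if not installed:
        raise FileNotFoundError("No versions of the engine are installed")
    if engine_version in (None, "latest"):
        # Newest installed build wins.
        newest = sorted(installed, key=lambda x: x["version"], reverse=True)[0]
        return newest["path"]
    match = next((x for x in installed if x["version"] == engine_version), None)
    if not match:
        match = download(engine_version)
    if not match:
        raise FileNotFoundError(f"Cannot find requested engine version {engine_version}")
    return match["path"]


print(resolve_engine_path("latest"))  # /engines/blender-4.1.0
print(resolve_engine_path("4.2.0"))   # downloaded on demand in this sketch
```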
@@ -417,7 +306,7 @@ class EngineManager:
        if not engine_path:
            raise FileNotFoundError(f"Cannot find requested engine version {engine_version}")

-        return worker_class(input_path=str(input_path), output_path=str(output_path), engine_path=engine_path, args=args,
+        return worker_class(input_path=input_path, output_path=output_path, engine_path=engine_path, args=args,
                            parent=parent, name=name)


@@ -434,14 +323,6 @@ class EngineDownloadWorker(threading.Thread):
        cpu (str, optional): Requested CPU architecture. Defaults to system CPU type.
    """
    def __init__(self, engine, version, system_os=None, cpu=None):
-        """Initialize download worker for specific engine version.
-
-        Args:
-            engine: Name of engine to download.
-            version: Version of engine to download.
-            system_os: Target operating system (defaults to current).
-            cpu: Target CPU architecture (defaults to current).
-        """
        super().__init__()
        self.engine = engine
        self.version = version
@@ -450,35 +331,25 @@ class EngineDownloadWorker(threading.Thread):
        self.percent_complete = 0

    def _update_progress(self, current_progress):
-        """Update download progress.
-
-        Args:
-            current_progress: Current download progress percentage (0-100).
-        """
        self.percent_complete = current_progress

    def run(self):
-        """Execute the download process.
-
-        Checks if engine version already exists, then downloads if not found.
-        Handles cleanup and error reporting.
-        """
        try:
            existing_download = EngineManager.is_version_installed(self.engine, self.version, self.system_os, self.cpu,
                                                                   ignore_system=True)
            if existing_download:
                logger.info(f"Requested download of {self.engine} {self.version}, but local copy already exists")
                return existing_download

            # Get the appropriate downloader class based on the engine type
-            downloader = EngineManager.engine_class_with_name(self.engine).downloader()
+            downloader = EngineManager.engine_with_name(self.engine).downloader()
            downloader.download_engine( self.version, download_location=EngineManager.engines_path,
                                        system_os=self.system_os, cpu=self.cpu, timeout=300, progress_callback=self._update_progress)
        except Exception as e:
            logger.error(f"Error in download worker: {e}")
        finally:
            # remove itself from the downloader list
            EngineManager.download_tasks.remove(self)


if __name__ == '__main__':
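`EngineDownloadWorker` runs one engine download on a background thread: skip the work when the build already exists locally, report progress through `_update_progress`, and always remove itself from the shared task list when it finishes. A self-contained sketch of that thread pattern with the actual download replaced by a stub (the names here are illustrative, not Zordon's):

```python
import threading
import time

ACTIVE_TASKS = []  # stands in for EngineManager.download_tasks


class DownloadWorker(threading.Thread):
    def __init__(self, engine, version):
        super().__init__(name=f"{engine}-{version}")
        self.engine = engine
        self.version = version
        self.percent_complete = 0

    def _update_progress(self, current_progress):
        self.percent_complete = current_progress

    def run(self):
        try:
            # A real implementation would stream the archive here and invoke
            # the progress callback as bytes arrive.
            for step in range(0, 101, 25):
                self._update_progress(step)
                time.sleep(0.05)
        finally:
            # Always drop out of the shared task list, even on failure.
            ACTIVE_TASKS.remove(self)


worker = DownloadWorker("blender", "4.1.0")
ACTIVE_TASKS.append(worker)
worker.start()
worker.join()
print(worker.percent_complete)  # 100
```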
@@ -20,8 +20,10 @@ class FFMPEG(BaseRenderEngine):
        from src.engines.ffmpeg.ffmpeg_worker import FFMPEGRenderWorker
        return FFMPEGRenderWorker

-    def ui_options(self):
-        return []
+    @staticmethod
+    def ui_options(system_info):
+        from src.engines.ffmpeg.ffmpeg_ui import FFMPEGUI
+        return FFMPEGUI.get_options(system_info)

    def supported_extensions(self):
        help_text = (subprocess.check_output([self.engine_path(), '-h', 'full'], stderr=subprocess.STDOUT,
src/engines/ffmpeg/ffmpeg_ui.py (new file, 5 lines)
@@ -0,0 +1,5 @@
+class FFMPEGUI:
+    @staticmethod
+    def get_options(system_info):
+        options = []
+        return options
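The new `FFMPEGUI` helper returns an empty list for now, and the job form elsewhere in this compare reads `option['name']` from each entry, so options are presumably dictionaries. A hedged sketch of what a populated `get_options` could look like under that assumption; the option names, extra keys, and the `system_info` usage are illustrative, not part of the diff:

```python
class FFMPEGUI:
    @staticmethod
    def get_options(system_info):
        # Illustrative only: the real ffmpeg_ui.py in this compare returns [].
        # The job form reads option['name'], so each entry is assumed to be a dict.
        options = [
            {"name": "video_codec", "type": "choice",
             "choices": ["libx264", "libx265"], "default": "libx264"},
            {"name": "crf", "type": "int", "default": 23},
        ]
        # system_info could plausibly be used to surface hardware encoders.
        if system_info and system_info.get("cpu") == "arm64":
            options[0]["choices"].append("h264_videotoolbox")
        return options


print(FFMPEGUI.get_options({"cpu": "arm64"}))
```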
@@ -1,36 +1,34 @@
 import logging
-from collections import Counter
+import os
 from datetime import datetime
-from pathlib import Path
-from typing import List, Dict, Any, Optional

 from pubsub import pub
 from sqlalchemy import create_engine
 from sqlalchemy.orm import sessionmaker
 from sqlalchemy.orm.exc import DetachedInstanceError

-from src.engines.core.base_worker import Base, BaseRenderWorker
+from src.engines.core.base_worker import Base
 from src.utilities.status_utils import RenderStatus

 logger = logging.getLogger()


 class JobNotFoundError(Exception):
-    def __init__(self, job_id: str, *args: Any) -> None:
+    def __init__(self, job_id, *args):
        super().__init__(args)
        self.job_id = job_id

-    def __str__(self) -> str:
+    def __str__(self):
        return f"Cannot find job with ID: {self.job_id}"


 class RenderQueue:
-    engine: Optional[create_engine] = None
-    session: Optional[sessionmaker] = None
-    job_queue: List[BaseRenderWorker] = []
-    maximum_renderer_instances: Dict[str, int] = {'blender': 1, 'aerender': 1, 'ffmpeg': 4}
-    last_saved_counts: Dict[str, int] = {}
-    is_running: bool = False
+    engine = None
+    session = None
+    job_queue = []
+    maximum_renderer_instances = {'blender': 1, 'aerender': 1, 'ffmpeg': 4}
+    last_saved_counts = {}
+    is_running = False

    # --------------------------------------------
    # Render Queue Evaluation:
@@ -118,11 +116,12 @@ class RenderQueue:
    # --------------------------------------------

    @classmethod
-    def load_state(cls, database_directory: Path):
+    def load_state(cls, database_directory):
        if not cls.engine:
-            cls.engine = create_engine(f"sqlite:///{database_directory / 'database.db'}")
+            cls.engine = create_engine(f"sqlite:///{os.path.join(database_directory, 'database.db')}")
            Base.metadata.create_all(cls.engine)
            cls.session = sessionmaker(bind=cls.engine)()
+            from src.engines.core.base_worker import BaseRenderWorker
            cls.job_queue = cls.session.query(BaseRenderWorker).all()
            pub.subscribe(cls.__local_job_status_changed, 'status_change')

@@ -135,7 +134,7 @@ class RenderQueue:
        logger.debug("Closing session")
        cls.stop()
        running_jobs = cls.jobs_with_status(RenderStatus.RUNNING)  # cancel all running jobs
-        _ = [cls.cancel_job(job) for job in running_jobs]
+        [cls.cancel_job(job) for job in running_jobs]
        cls.save_state()
        cls.session.close()

@@ -145,6 +144,7 @@ class RenderQueue:

    @classmethod
    def renderer_instances(cls):
+        from collections import Counter
        all_instances = [x.engine_name for x in cls.running_jobs()]
        return Counter(all_instances)

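`RenderQueue` keeps a per-engine cap (`maximum_renderer_instances`) and a `Counter` of running jobs per engine (`renderer_instances`). How the queue combines the two is outside this hunk, so the gate below is only an illustration of the obvious pairing, with made-up job data:

```python
from collections import Counter

maximum_renderer_instances = {'blender': 1, 'aerender': 1, 'ffmpeg': 4}

# Pretend these are the engine names of the currently running jobs.
running_engines = ['ffmpeg', 'ffmpeg', 'blender']
renderer_instances = Counter(running_engines)


def can_start(engine_name):
    """True if another job for this engine may start, per the instance caps."""
    limit = maximum_renderer_instances.get(engine_name, 1)
    return renderer_instances[engine_name] < limit


print(can_start('ffmpeg'))   # True  (2 of 4 slots used)
print(can_start('blender'))  # False (1 of 1 slots used)
```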
@@ -1,5 +1,5 @@
+import os.path
 import socket
-from pathlib import Path

 import psutil
 from PyQt6.QtCore import QThread, pyqtSignal, Qt, pyqtSlot
@@ -12,8 +12,9 @@ from PyQt6.QtWidgets import (
 from src.api.server_proxy import RenderServerProxy
 from src.engines.engine_manager import EngineManager
 from src.ui.engine_help_window import EngineHelpViewer
-from src.utilities.misc_helper import COMMON_RESOLUTIONS, COMMON_FRAME_RATES
 from src.utilities.zeroconf_server import ZeroconfServer
+from src.utilities.misc_helper import COMMON_RESOLUTIONS
+from utilities.misc_helper import COMMON_FRAME_RATES


 class NewRenderJobForm(QWidget):
@@ -62,15 +63,19 @@ class NewRenderJobForm(QWidget):

        # Job / Server Data
        self.server_proxy = RenderServerProxy(socket.gethostname())
+        self.engine_info = None
        self.project_info = None
-        self.installed_engines = {}
-        self.preferred_engine = None

        # Setup
        self.setWindowTitle("New Job")
        self.setup_ui()
+        self.update_engine_info()
        self.setup_project()

+        # get renderer info in bg thread
+        # t = threading.Thread(target=self.update_renderer_info)
+        # t.start()
+
        self.show()

    def setup_ui(self):
@@ -102,8 +107,6 @@ class NewRenderJobForm(QWidget):
        job_name_layout.addWidget(QLabel("Job name:"))
        self.job_name_input = QLineEdit()
        job_name_layout.addWidget(self.job_name_input)
-        self.engine_type = QComboBox()
-        job_name_layout.addWidget(self.engine_type)
        file_group_layout.addLayout(job_name_layout)

        # Job File
@@ -190,7 +193,7 @@ class NewRenderJobForm(QWidget):
        resolution_group.setLayout(resolution_group_layout)

        # Resolution
-        resolution_layout = QHBoxLayout()
+        resolution_layout = QHBoxLayout(resolution_group)
        self.resolution_options_list = QComboBox()
        self.resolution_options_list.setFixedWidth(200)
        self.resolution_options_list.addItem("Original Size")
@@ -212,7 +215,7 @@ class NewRenderJobForm(QWidget):
        resolution_layout.addWidget(self.resolution_y_input)
        resolution_layout.addStretch()

-        fps_layout = QHBoxLayout()
+        fps_layout = QHBoxLayout(resolution_group)
        self.fps_options_list = QComboBox()
        self.fps_options_list.setFixedWidth(200)
        self.fps_options_list.addItem("Original FPS")
@@ -239,7 +242,12 @@ class NewRenderJobForm(QWidget):
        engine_group_layout = QVBoxLayout(self.engine_group)

        engine_layout = QHBoxLayout()
-        engine_layout.addWidget(QLabel("Engine Version:"))
+        engine_layout.addWidget(QLabel("Engine:"))
+        self.engine_type = QComboBox()
+        self.engine_type.currentIndexChanged.connect(self.engine_changed)
+        engine_layout.addWidget(self.engine_type)
+
+        engine_layout.addWidget(QLabel("Version:"))
        self.engine_version_combo = QComboBox()
        self.engine_version_combo.addItem('latest')
        engine_layout.addWidget(self.engine_version_combo)
@@ -264,7 +272,6 @@ class NewRenderJobForm(QWidget):
        self.cameras_group = QWidget()
        cameras_layout = QVBoxLayout(self.cameras_group)
        self.cameras_list = QListWidget()
-        self.cameras_list.itemChanged.connect(self.update_job_count)
        cameras_layout.addWidget(self.cameras_list)

        # ==================== Tab 5: Misc / Notes ====================
@@ -304,21 +311,6 @@ class NewRenderJobForm(QWidget):
        self.toggle_engine_enablement(False)
        self.tabs.setCurrentIndex(0)

-    def update_job_count(self, changed_item=None):
-        checked = 1
-        if self.cameras_group.enabled:
-            checked = 0
-        total = self.cameras_list.count()
-
-        for i in range(total):
-            item = self.cameras_list.item(i)
-            if item.checkState() == Qt.CheckState.Checked:
-                checked += 1
-
-        message = f"Submit {checked} Jobs" if checked > 1 else "Submit Job"
-        self.submit_button.setText(message)
-        self.submit_button.setEnabled(bool(checked))
-
    def _resolution_preset_changed(self, index):
        selected_res = COMMON_RESOLUTIONS.get(self.resolution_options_list.currentText())
        if selected_res:
@@ -335,6 +327,16 @@ class NewRenderJobForm(QWidget):
        elif index == 0:
            self.fps_input.setValue(self.project_info.get('fps'))

+    def update_engine_info(self):
+        # get the engine info and add them all to the ui
+        self.engine_info = self.server_proxy.get_engine_info(response_type='full')
+        self.engine_type.addItems(self.engine_info.keys())
+        # select the best engine for the file type
+        engine = EngineManager.engine_for_project_path(self.project_path)
+        self.engine_type.setCurrentText(engine.name().lower())
+        # refresh ui
+        self.engine_changed()
+
    def engine_changed(self):
        # load the version numbers
        current_engine = self.engine_type.currentText().lower() or self.engine_type.itemText(0)
@@ -342,13 +344,9 @@ class NewRenderJobForm(QWidget):
        self.engine_version_combo.addItem('latest')
        self.file_format_combo.clear()
        if current_engine:
-            engine_info = self.server_proxy.get_engine_info(current_engine, 'full', timeout=10)
-            self.current_engine_options = engine_info.get('options', [])
-            if not engine_info:
-                raise FileNotFoundError(f"Cannot get information about engine '{current_engine}'")
-            engine_vers = [v['version'] for v in engine_info['versions']]
+            engine_vers = [version_info['version'] for version_info in self.engine_info[current_engine]['versions']]
            self.engine_version_combo.addItems(engine_vers)
-            self.file_format_combo.addItems(engine_info.get('supported_export_formats'))
+            self.file_format_combo.addItems(self.engine_info[current_engine]['supported_export_formats'])

    def update_server_list(self):
        clients = ZeroconfServer.found_hostnames()
@@ -367,14 +365,14 @@ class NewRenderJobForm(QWidget):
        self.process_label.setHidden(False)
        self.toggle_engine_enablement(False)

-        output_name = Path(self.scene_file_input.text()).stem.replace(' ', '_')
+        output_name, _ = os.path.splitext(os.path.basename(self.scene_file_input.text()))
+        output_name = output_name.replace(' ', '_')
        self.job_name_input.setText(output_name)
        file_name = self.scene_file_input.text()

        # setup bg worker
        self.worker_thread = GetProjectInfoWorker(window=self, project_path=file_name)
        self.worker_thread.message_signal.connect(self.post_get_project_info_update)
-        self.worker_thread.error_signal.connect(self.show_error_message)
        self.worker_thread.start()

    def browse_output_path(self):
@@ -388,26 +386,14 @@ class NewRenderJobForm(QWidget):
        self.engine_help_viewer = EngineHelpViewer(url)
        self.engine_help_viewer.show()

-    def show_error_message(self, message):
-        msg = QMessageBox(self)
-        msg.setIcon(QMessageBox.Icon.Critical)
-        msg.setWindowTitle("Error")
-        msg.setText(message)
-        msg.exec()
-
    # -------- Update --------

    def post_get_project_info_update(self):
        """Called by the GetProjectInfoWorker - Do not call directly."""
        try:

-            self.engine_type.addItems(self.installed_engines.keys())
-            self.engine_type.setCurrentText(self.preferred_engine)
-            self.engine_changed()
-
            # Set the best engine we can find
            input_path = self.scene_file_input.text()
-            engine = EngineManager.engine_class_for_project_path(input_path)
+            engine = EngineManager.engine_for_project_path(input_path)

            engine_index = self.engine_type.findText(engine.name().lower())
            if engine_index >= 0:
@@ -445,11 +431,11 @@ class NewRenderJobForm(QWidget):
                self.cameras_list.item(0).setCheckState(Qt.CheckState.Checked)
            else:
                self.tabs.setTabEnabled(index, False)
-            self.update_job_count()

            # Dynamic Engine Options
            clear_layout(self.engine_options_layout)  # clear old options
            # dynamically populate option list
+            self.current_engine_options = engine().ui_options()
            for option in self.current_engine_options:
                h_layout = QHBoxLayout()
                label = QLabel(option['name'].replace('_', ' ').capitalize() + ':')
@@ -599,16 +585,15 @@ class SubmitWorker(QThread):
                children_jobs.append(child_job_data)
            job_json['child_jobs'] = children_jobs

-            # presubmission tasks - use local installs
-            engine_class = EngineManager.engine_class_with_name(self.window.engine_type.currentText().lower())
-            latest_engine = EngineManager.get_latest_engine_instance(engine_class)
-            input_path = Path(latest_engine.perform_presubmission_tasks(input_path))
+            # presubmission tasks
+            engine = EngineManager.engine_with_name(self.window.engine_type.currentText().lower())
+            input_path = engine().perform_presubmission_tasks(input_path)

            # submit
            err_msg = ""
            result = self.window.server_proxy.post_job_to_server(file_path=input_path, job_data=job_json,
                                                                 callback=create_callback)
            if not (result and result.ok):
-                err_msg = f"Error posting job to server: {result.text}"
+                err_msg = f"Error posting job to server: {result.message}"

            self.message_signal.emit(err_msg)

@@ -620,7 +605,6 @@ class GetProjectInfoWorker(QThread):
    """Worker class called to retrieve information about a project file on a background thread and update the UI"""

    message_signal = pyqtSignal()
-    error_signal = pyqtSignal(str)

    def __init__(self, window, project_path):
        super().__init__()
@@ -628,19 +612,9 @@ class GetProjectInfoWorker(QThread):
        self.project_path = project_path

    def run(self):
-        try:
-            # get the engine info and add them all to the ui
-            self.window.installed_engines = self.window.server_proxy.get_installed_engines()
-            # select the best engine for the file type
-            self.window.preferred_engine = self.window.server_proxy.get_engine_for_filename(self.project_path)
-
-            # this should be the only time we use a local engine instead of using the proxy besides submitting
-            engine_class = EngineManager.engine_class_for_project_path(self.project_path)
-            engine = EngineManager.get_latest_engine_instance(engine_class)
-            self.window.project_info = engine.get_project_info(self.project_path)
-            self.message_signal.emit()
-        except Exception as e:
-            self.error_signal.emit(str(e))
+        engine = EngineManager.engine_for_project_path(self.project_path)
+        self.window.project_info = engine().get_project_info(self.project_path)
+        self.message_signal.emit()


 def clear_layout(layout):
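The form derives the default job name from the selected scene file; the removed side uses pathlib, the added side os.path, with identical results. A standalone sketch of both derivations on a placeholder path:

```python
import os
from pathlib import Path

scene_file = "/projects/My Scene v2.blend"  # placeholder path

# pathlib variant (removed side)
name_a = Path(scene_file).stem.replace(' ', '_')

# os.path variant (added side)
name_b, _ = os.path.splitext(os.path.basename(scene_file))
name_b = name_b.replace(' ', '_')

print(name_a)  # My_Scene_v2
print(name_b)  # My_Scene_v2
```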
@@ -93,7 +93,7 @@ class EngineBrowserWindow(QMainWindow):
    def update_table(self):

        def update_table_worker():
-            raw_server_data = RenderServerProxy(self.hostname).get_all_engine_info()
+            raw_server_data = RenderServerProxy(self.hostname).get_engine_info()
            if not raw_server_data:
                return

@@ -7,12 +7,11 @@ import os
 import sys
 import threading
 import time
-from typing import List, Dict, Any, Optional

 import PIL
 import humanize
 from PIL import Image
-from PyQt6.QtCore import Qt, QByteArray, QBuffer, QIODevice, QThread, pyqtSignal
+from PyQt6.QtCore import Qt, QByteArray, QBuffer, QIODevice, QThread
 from PyQt6.QtGui import QPixmap, QImage, QFont, QIcon
 from PyQt6.QtWidgets import QMainWindow, QWidget, QHBoxLayout, QListWidget, QTableWidget, QAbstractItemView, \
    QTableWidgetItem, QLabel, QVBoxLayout, QHeaderView, QMessageBox, QGroupBox, QPushButton, QListWidgetItem, \
@@ -53,12 +52,12 @@ class MainWindow(QMainWindow):
        super().__init__()

        # Load the queue
-        self.job_list_view: Optional[QTableWidget] = None
-        self.server_info_ram: Optional[str] = None
-        self.server_info_cpu: Optional[str] = None
-        self.server_info_os: Optional[str] = None
-        self.server_info_gpu: Optional[List[Dict[str, Any]]] = None
-        self.server_info_hostname: Optional[str] = None
+        self.job_list_view = None
+        self.server_info_ram = None
+        self.server_info_cpu = None
+        self.server_info_os = None
+        self.server_info_gpu = None
+        self.server_info_hostname = None
        self.engine_browser_window = None
        self.server_info_group = None
        self.current_hostname = None
@@ -100,10 +99,8 @@ class MainWindow(QMainWindow):
        self.create_toolbars()

        # start background update
-        self.found_servers = []
-        self.job_data = {}
-        self.bg_update_thread = BackgroundUpdater(window=self)
-        self.bg_update_thread.updated_signal.connect(self.update_ui_data)
+        self.bg_update_thread = QThread()
+        self.bg_update_thread.run = self.__background_update
        self.bg_update_thread.start()

        # Setup other windows
@@ -114,12 +111,7 @@ class MainWindow(QMainWindow):
        # Pick default job
        self.job_picked()

-    def setup_ui(self, main_layout: QVBoxLayout) -> None:
-        """Setup the main user interface layout.
-
-        Args:
-            main_layout: The main layout container for the UI widgets.
-        """
+    def setup_ui(self, main_layout):

        # Servers
        server_list_group = QGroupBox("Available Servers")
@@ -169,41 +161,45 @@ class MainWindow(QMainWindow):
        self.job_list_view.verticalHeader().setVisible(False)
        self.job_list_view.itemSelectionChanged.connect(self.job_picked)
        self.job_list_view.setEditTriggers(QAbstractItemView.EditTrigger.NoEditTriggers)
+        self.refresh_job_headers()

-        # Setup Job Headers
-        self.job_list_view.setHorizontalHeaderLabels(["ID", "Name", "Engine", "Priority", "Status",
-                                                      "Time Elapsed", "Frames", "Date Created"])
-        self.job_list_view.setColumnHidden(0, True)
-
-        self.job_list_view.horizontalHeader().setSectionResizeMode(1, QHeaderView.ResizeMode.Stretch)
-        self.job_list_view.horizontalHeader().setSectionResizeMode(2, QHeaderView.ResizeMode.ResizeToContents)
-        self.job_list_view.horizontalHeader().setSectionResizeMode(3, QHeaderView.ResizeMode.ResizeToContents)
-        self.job_list_view.horizontalHeader().setSectionResizeMode(4, QHeaderView.ResizeMode.ResizeToContents)
-        self.job_list_view.horizontalHeader().setSectionResizeMode(5, QHeaderView.ResizeMode.ResizeToContents)
-        self.job_list_view.horizontalHeader().setSectionResizeMode(6, QHeaderView.ResizeMode.ResizeToContents)
-        self.job_list_view.horizontalHeader().setSectionResizeMode(7, QHeaderView.ResizeMode.ResizeToContents)
-
-        # Job List Layout
-        job_list_group = QGroupBox("Job Preview")
+        # Image Layout
+        image_group = QGroupBox("Job Preview")
+        image_layout = QVBoxLayout(image_group)
+        image_layout.setContentsMargins(0, 0, 0, 0)
+        image_center_layout = QHBoxLayout()
+        image_center_layout.addWidget(self.image_label)
+        image_layout.addWidget(self.image_label)
+        # image_layout.addLayout(image_center_layout)
+
+        # Job Layout
+        job_list_group = QGroupBox("Render Jobs")
        job_list_layout = QVBoxLayout(job_list_group)
        job_list_layout.setContentsMargins(0, 0, 0, 0)
-        job_list_layout.addWidget(self.image_label)
-        job_list_layout.addWidget(self.job_list_view, stretch=True)
+        image_layout.addWidget(self.job_list_view, stretch=True)
+        image_layout.addLayout(job_list_layout)

        # Add them all to the window
        main_layout.addLayout(info_layout)

        right_layout = QVBoxLayout()
        right_layout.setContentsMargins(0, 0, 0, 0)
-        right_layout.addWidget(job_list_group)
+        right_layout.addWidget(image_group)
+        # right_layout.addWidget(job_list_group)
        main_layout.addLayout(right_layout)

-    def closeEvent(self, event):
-        """Handle window close event with job running confirmation.
-
-        Args:
-            event: The close event triggered by user.
-        """
+    def __background_update(self):
+        while True:
+            try:
+                self.update_servers()
+                self.fetch_jobs()
+            except RuntimeError:
+                pass
+            except Exception as e:
+                logger.error(f"Uncaught exception in background update: {e}")
+            time.sleep(0.5)
+
+    def closeEvent(self, event):
        running_jobs = len(RenderQueue.running_jobs())
        if running_jobs:
            reply = QMessageBox.question(self, "Running Jobs",
@@ -216,12 +212,7 @@ class MainWindow(QMainWindow):
            else:
                event.ignore()

    # -- Server Code -- #

-    def refresh_job_list(self):
-        """Refresh the job list display."""
-        self.job_list_view.clearContents()
-        self.bg_update_thread.needs_update = True
-
    @property
    def current_server_proxy(self):
@@ -238,7 +229,7 @@ class MainWindow(QMainWindow):
        # Update the current hostname and clear the job list
        self.current_hostname = new_hostname
        self.job_list_view.setRowCount(0)
-        self.refresh_job_list()
+        self.fetch_jobs(clear_table=True)

        # Select the first row if there are jobs listed
        if self.job_list_view.rowCount():
@@ -290,19 +281,21 @@ class MainWindow(QMainWindow):
        self.server_info_ram.setText(memory_info)
        self.server_info_gpu.setText(gpu_info)

-    def update_ui_data(self):
-        """Update UI data with current server and job information."""
-        self.update_servers()
-
+    def fetch_jobs(self, clear_table=False):
        if not self.current_server_proxy:
            return

-        server_job_data = self.job_data.get(self.current_server_proxy.hostname)
-        if server_job_data:
-            num_jobs = len(server_job_data)
+        if clear_table:
+            self.job_list_view.clear()
+            self.refresh_job_headers()
+
+        job_fetch = self.current_server_proxy.get_all_jobs(ignore_token=False)
+        if job_fetch:
+            num_jobs = len(job_fetch)
            self.job_list_view.setRowCount(num_jobs)

-            for row, job in enumerate(server_job_data):
+            for row, job in enumerate(job_fetch):

                display_status = job['status'] if job['status'] != RenderStatus.RUNNING.value else \
                    ('%.0f%%' % (job['percent_complete'] * 100))  # if running, show percent, otherwise just show status
@@ -327,7 +320,7 @@ class MainWindow(QMainWindow):
                for col, item in enumerate(items):
                    self.job_list_view.setItem(row, col, item)

    # -- Job Code -- #
    def job_picked(self):

        def fetch_preview(job_id):
@@ -393,11 +386,6 @@ class MainWindow(QMainWindow):
        self.topbar.actions_call['Open Files'].setVisible(False)

    def selected_job_ids(self):
-        """Get list of selected job IDs from the job list.
-
-        Returns:
-            List[str]: List of selected job ID strings.
-        """
        try:
            selected_rows = self.job_list_view.selectionModel().selectedRows()
            job_ids = []
@@ -408,16 +396,23 @@ class MainWindow(QMainWindow):
        except AttributeError:
            return []

+    def refresh_job_headers(self):
+        self.job_list_view.setHorizontalHeaderLabels(["ID", "Name", "Engine", "Priority", "Status",
+                                                      "Time Elapsed", "Frames", "Date Created"])
+        self.job_list_view.setColumnHidden(0, True)
+
+        self.job_list_view.horizontalHeader().setSectionResizeMode(1, QHeaderView.ResizeMode.Stretch)
+        self.job_list_view.horizontalHeader().setSectionResizeMode(2, QHeaderView.ResizeMode.ResizeToContents)
+        self.job_list_view.horizontalHeader().setSectionResizeMode(3, QHeaderView.ResizeMode.ResizeToContents)
+        self.job_list_view.horizontalHeader().setSectionResizeMode(4, QHeaderView.ResizeMode.ResizeToContents)
+        self.job_list_view.horizontalHeader().setSectionResizeMode(5, QHeaderView.ResizeMode.ResizeToContents)
+        self.job_list_view.horizontalHeader().setSectionResizeMode(6, QHeaderView.ResizeMode.ResizeToContents)
+        self.job_list_view.horizontalHeader().setSectionResizeMode(7, QHeaderView.ResizeMode.ResizeToContents)
+
    # -- Image Code -- #

    def load_image_path(self, image_path):
-        """Load and display an image from file path.
-
-        Args:
-            image_path: Path to the image file to load.
-        """
-        # Load and set image using QPixmap
+        # Load and set the image using QPixmap
        try:
            pixmap = QPixmap(image_path)
            if not pixmap:
@@ -450,25 +445,28 @@ class MainWindow(QMainWindow):
                logger.error(f"Error loading image data: {e}")

    def update_servers(self):
+        found_servers = list(set(ZeroconfServer.found_hostnames() + self.added_hostnames))
+        found_servers = [x for x in found_servers if ZeroconfServer.get_hostname_properties(x)['api_version'] == API_VERSION]
+
        # Always make sure local hostname is first
-        if self.found_servers and not is_localhost(self.found_servers[0]):
-            for hostname in self.found_servers:
+        if found_servers and not is_localhost(found_servers[0]):
+            for hostname in found_servers:
                if is_localhost(hostname):
-                    self.found_servers.remove(hostname)
-                    self.found_servers.insert(0, hostname)
+                    found_servers.remove(hostname)
+                    found_servers.insert(0, hostname)
                    break

        old_count = self.server_list_view.count()

        # Update proxys
-        for hostname in self.found_servers:
+        for hostname in found_servers:
            ServerProxyManager.get_proxy_for_hostname(hostname)  # setup background updates

        # Add in all the missing servers
        current_server_list = []
        for i in range(self.server_list_view.count()):
            current_server_list.append(self.server_list_view.item(i).text())
-        for hostname in self.found_servers:
+        for hostname in found_servers:
            if hostname not in current_server_list:
                properties = ZeroconfServer.get_hostname_properties(hostname)
                image_path = os.path.join(resources_dir(), f"{properties.get('system_os', 'Monitor')}.png")
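`update_servers` pins the local hostname to the front of the discovered-server list before reconciling it with the list widget. A standalone sketch of that reordering; `is_localhost` here is a simplified stand-in for the project's helper, and the hostnames are made up:

```python
import socket


def is_localhost(hostname):
    # Simplified stand-in for the helper used by the window.
    return hostname in (socket.gethostname(), "localhost", "127.0.0.1")


found_servers = ["render-node-2", socket.gethostname(), "render-node-1"]

# Same shuffle as update_servers(): if the first entry is not local,
# move the local hostname (when present) to index 0.
if found_servers and not is_localhost(found_servers[0]):
    for hostname in found_servers:
        if is_localhost(hostname):
            found_servers.remove(hostname)
            found_servers.insert(0, hostname)
            break

print(found_servers)  # local hostname first, discovery order otherwise preserved
```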
@@ -479,7 +477,7 @@ class MainWindow(QMainWindow):
|
|||||||
servers_to_remove = []
|
servers_to_remove = []
|
||||||
for i in range(self.server_list_view.count()):
|
for i in range(self.server_list_view.count()):
|
||||||
name = self.server_list_view.item(i).text()
|
name = self.server_list_view.item(i).text()
|
||||||
if name not in self.found_servers:
|
if name not in found_servers:
|
||||||
servers_to_remove.append(name)
|
servers_to_remove.append(name)
|
||||||
|
|
||||||
# remove any servers that shouldn't be shown any longer
|
# remove any servers that shouldn't be shown any longer
|
||||||
@@ -526,7 +524,7 @@ class MainWindow(QMainWindow):
                                          "New Job", f"{resources_directory}/AddProduct.png", self.new_job)
         self.addToolBar(Qt.ToolBarArea.TopToolBarArea, self.topbar)

         # -- Toolbar Buttons -- #

     def open_console_window(self) -> None:
         """
@@ -541,9 +539,8 @@ class MainWindow(QMainWindow):
         self.engine_browser_window.show()

     def job_logs(self) -> None:
-        """Open log viewer for selected job.
-
-        Opens a log viewer window showing the logs for the currently selected job.
+        """
+        Event handler for the "Logs" button.
         """
         selected_job_ids = self.selected_job_ids()
         if selected_job_ids:
@@ -552,10 +549,8 @@ class MainWindow(QMainWindow):
         self.log_viewer_window.show()

     def stop_job(self, event):
-        """Stop selected render jobs with user confirmation.
-
-        Args:
-            event: The button click event.
+        """
+        Event handler for the Stop Job button
         """
         job_ids = self.selected_job_ids()
         if not job_ids:
@@ -565,7 +560,7 @@ class MainWindow(QMainWindow):
            job = next((job for job in self.current_server_proxy.get_all_jobs() if job.get('id') == job_ids[0]), None)
            if job:
                display_name = job.get('name', os.path.basename(job.get('input_path', '')))
-                message = f"Are you sure you want to stop job: {display_name}?"
+                message = f"Are you sure you want to stop the job:\n{display_name}?"
            else:
                return  # Job not found, handle this case as needed
        else:
@@ -578,13 +573,11 @@ class MainWindow(QMainWindow):
        if msg_box.exec() == QMessageBox.StandardButton.Yes:
            for job_id in job_ids:
                self.current_server_proxy.cancel_job(job_id, confirm=True)
-            self.refresh_job_list()
+            self.fetch_jobs(clear_table=True)

     def delete_job(self, event):
-        """Delete selected render jobs with user confirmation.
-
-        Args:
-            event: The button click event.
+        """
+        Event handler for the Delete Job button
         """
        job_ids = self.selected_job_ids()
        if not job_ids:
@@ -607,7 +600,7 @@ class MainWindow(QMainWindow):
        if msg_box.exec() == QMessageBox.StandardButton.Yes:
            for job_id in job_ids:
                self.current_server_proxy.delete_job(job_id, confirm=True)
-            self.refresh_job_list()
+            self.fetch_jobs(clear_table=True)

     def download_files(self, event):

@@ -638,41 +631,6 @@ class MainWindow(QMainWindow):
         self.new_job_window.show()


-class BackgroundUpdater(QThread):
-    """Worker class to fetch job and server information and update the UI"""
-
-    updated_signal = pyqtSignal()
-    error_signal = pyqtSignal(str)
-
-    def __init__(self, window):
-        super().__init__()
-        self.window = window
-        self.needs_update = True
-
-    def run(self):
-        """Main background thread execution loop.
-
-        Continuously fetches server and job data, updating the main UI
-        every second or when updates are needed.
-        """
-        try:
-            last_run = 0
-            while True:
-                now = time.monotonic()
-                if now - last_run >= 1.0 or self.needs_update:
-                    self.window.found_servers = list(set(ZeroconfServer.found_hostnames() + self.window.added_hostnames))
-                    self.window.found_servers = [x for x in self.window.found_servers if
-                                                 ZeroconfServer.get_hostname_properties(x)['api_version'] == API_VERSION]
-                    if self.window.current_server_proxy:
-                        self.window.job_data[self.window.current_server_proxy.hostname] = \
-                            self.window.current_server_proxy.get_all_jobs(ignore_token=False)
-                    self.needs_update = False
-                    self.updated_signal.emit()
-                time.sleep(0.05)
-        except Exception as e:
-            print(f"ERROR: {e}")
-            self.error_signal.emit(str(e))
-
 if __name__ == "__main__":
     # lazy load GUI frameworks
     from PyQt6.QtWidgets import QApplication
@@ -15,7 +15,7 @@ from PyQt6.QtWidgets import QApplication, QMainWindow, QListWidget, QListWidgetI
 from src.api.server_proxy import RenderServerProxy
 from src.engines.engine_manager import EngineManager
 from src.utilities.config import Config
-from src.utilities.misc_helper import launch_url
+from src.utilities.misc_helper import launch_url, system_safe_path
 from src.version import APP_AUTHOR, APP_NAME

 settings = QSettings(APP_AUTHOR, APP_NAME)
@@ -37,7 +37,7 @@ class GetEngineInfoWorker(QThread):
         self.parent = parent

     def run(self):
-        data = RenderServerProxy(socket.gethostname()).get_all_engine_info()
+        data = RenderServerProxy(socket.gethostname()).get_engine_info()
         self.done.emit(data)

 class SettingsWindow(QMainWindow):
@@ -59,7 +59,9 @@ class SettingsWindow(QMainWindow):
         self.check_for_new_engines_button = None

         if not EngineManager.engines_path:  # fix issue where sometimes path was not set
-            EngineManager.engines_path = Path(Config.upload_folder).expanduser() / "engines"
+            EngineManager.engines_path = system_safe_path(
+                os.path.join(os.path.join(os.path.expanduser(Config.upload_folder),
+                                          'engines')))

         self.installed_engines_table = None

@@ -411,7 +413,7 @@ class SettingsWindow(QMainWindow):
            msg_result = msg_box.exec()
            messagebox_shown = True
            if msg_result == QMessageBox.StandardButton.Yes:
-                EngineManager.download_engine(engine_name=engine.name(), version=result['version'], background=True,
+                EngineManager.download_engine(engine=engine.name(), version=result['version'], background=True,
                                               ignore_system=ignore_system)
                self.engine_download_progress_bar.setHidden(False)
                self.engine_download_progress_bar.setValue(0)
@@ -1,6 +1,4 @@
 import os
-from pathlib import Path
-
 import yaml
 from src.utilities.misc_helper import current_system_os, copy_directory_contents

@@ -25,7 +23,7 @@ class Config:
        with open(config_path, 'r') as ymlfile:
            cfg = yaml.safe_load(ymlfile)

-        cls.upload_folder = str(Path(cfg.get('upload_folder', cls.upload_folder)).expanduser())
+        cls.upload_folder = os.path.expanduser(cfg.get('upload_folder', cls.upload_folder))
        cls.update_engines_on_launch = cfg.get('update_engines_on_launch', cls.update_engines_on_launch)
        cls.max_content_path = cfg.get('max_content_path', cls.max_content_path)
        cls.server_log_level = cfg.get('server_log_level', cls.server_log_level)
@@ -39,14 +37,14 @@ class Config:
         cls.download_timeout_seconds = cfg.get('download_timeout_seconds', cls.download_timeout_seconds)

     @classmethod
-    def config_dir(cls) -> Path:
+    def config_dir(cls):
         # Set up the config path
         if current_system_os() == 'macos':
-            local_config_path = Path('~/Library/Application Support/Zordon').expanduser()
+            local_config_path = os.path.expanduser('~/Library/Application Support/Zordon')
         elif current_system_os() == 'windows':
-            local_config_path = Path(os.environ['APPDATA']) / 'Zordon'
+            local_config_path = os.path.join(os.environ['APPDATA'], 'Zordon')
         else:
-            local_config_path = Path('~/.config/Zordon').expanduser()
+            local_config_path = os.path.expanduser('~/.config/Zordon')
         return local_config_path

     @classmethod
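Side note on the `config_dir()` change in the hunk above: the pathlib and `os.path` spellings name the same per-OS directory; the difference is only that one returns a `Path` object and the other a plain string. A minimal sketch, assuming a POSIX system (the `Zordon` directory name is taken from the hunk itself):

```python
import os
from pathlib import Path

# One side of the hunk: pathlib-based, returns a Path object
pathlib_style = Path('~/.config/Zordon').expanduser()
# Other side of the hunk: os.path-based, returns a str
os_path_style = os.path.expanduser('~/.config/Zordon')

# On a POSIX system both resolve to the same directory, e.g. /home/<user>/.config/Zordon
if os.name == 'posix':
    assert str(pathlib_style) == os_path_style
```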
@@ -63,9 +61,10 @@ class Config:
         # Determine the template path
         resource_environment_path = os.environ.get('RESOURCEPATH')
         if resource_environment_path:
-            template_path = Path(resource_environment_path) / 'config'
+            template_path = os.path.join(resource_environment_path, 'config')
         else:
-            template_path = Path(__file__).resolve().parents[2] / 'config'
+            template_path = os.path.join(
+                os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))), 'config')

         # Copy contents from the template to the local configuration directory
         copy_directory_contents(template_path, local_config_dir)
@@ -9,12 +9,11 @@ import string
 import subprocess
 import sys
 from datetime import datetime
-from typing import Optional, List, Dict, Any

 logger = logging.getLogger()


-def launch_url(url: str) -> None:
+def launch_url(url):
     logger = logging.getLogger(__name__)

     if shutil.which('xdg-open'):
@@ -38,7 +37,7 @@ def launch_url(url: str) -> None:
         logger.error(f"Failed to launch URL: {url}. Error: {e}")


-def file_exists_in_mounts(filepath: str) -> Optional[str]:
+def file_exists_in_mounts(filepath):
     """
     Check if a file exists in any mounted directory.
     It searches for the file in common mount points like '/Volumes', '/mnt', and '/media'.
@@ -79,7 +78,7 @@ def file_exists_in_mounts(filepath: str) -> Optional[str]:
         return possible_mount_path


-def get_time_elapsed(start_time: Optional[datetime] = None, end_time: Optional[datetime] = None) -> str:
+def get_time_elapsed(start_time=None, end_time=None):

     def strfdelta(tdelta, fmt='%H:%M:%S'):
         days = tdelta.days
@@ -106,7 +105,7 @@ def get_time_elapsed(start_time: Optional[datetime] = None, end_time: Optional[d
     return elapsed_time_string


-def get_file_size_human(file_path: str) -> str:
+def get_file_size_human(file_path):
     size_in_bytes = os.path.getsize(file_path)

     # Convert size to a human-readable format
@@ -122,19 +121,26 @@ def get_file_size_human(file_path: str) -> str:
        return f"{size_in_bytes / 1024 ** 4:.2f} TB"


-def current_system_os() -> str:
+# Convert path to the appropriate format for the current platform
+def system_safe_path(path):
+    if platform.system().lower() == "windows":
+        return os.path.normpath(path)
+    return path.replace("\\", "/")
+
+
+def current_system_os():
     return platform.system().lower().replace('darwin', 'macos')


-def current_system_os_version() -> str:
-    return platform.release()
+def current_system_os_version():
+    return platform.mac_ver()[0] if current_system_os() == 'macos' else platform.release().lower()


-def current_system_cpu() -> str:
-    return platform.machine().lower().replace('amd64', 'x64')
+def current_system_cpu():
+    # convert all x86 64 to "x64"
+    return platform.machine().lower().replace('amd64', 'x64').replace('x86_64', 'x64')


-def current_system_cpu_brand() -> str:
+def current_system_cpu_brand():
     """Fast cross-platform CPU brand string"""
     if sys.platform.startswith('darwin'):  # macOS
         try:
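A quick, self-contained check of the CPU-string normalization shown in the hunk above; `normalize_cpu` here is a local stand-in for the body of the new `current_system_cpu()`:

```python
# Stand-in for the new current_system_cpu() body shown above.
def normalize_cpu(machine: str) -> str:
    return machine.lower().replace('amd64', 'x64').replace('x86_64', 'x64')

assert normalize_cpu('AMD64') == 'x64'    # platform.machine() on 64-bit Windows
assert normalize_cpu('x86_64') == 'x64'   # typical value on Linux / Intel macOS
assert normalize_cpu('arm64') == 'arm64'  # Apple Silicon passes through unchanged
```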
@@ -169,21 +175,28 @@ def current_system_cpu_brand() -> str:
    # Ultimate fallback
    return platform.processor() or 'Unknown CPU'

-def resources_dir() -> str:
-    return os.path.join(os.path.dirname(__file__), '..', '..', 'resources')
+def resources_dir():
+    resource_environment_path = os.environ.get('RESOURCEPATH', None)
+    if resource_environment_path:  # running inside resource bundle
+        return os.path.join(resource_environment_path, 'resources')
+    else:
+        return os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))), 'resources')


-def copy_directory_contents(src_dir: str, dst_dir: str) -> None:
+def copy_directory_contents(src_dir, dst_dir):
+    """
+    Copy the contents of the source directory (src_dir) to the destination directory (dst_dir).
+    """
     for item in os.listdir(src_dir):
         src_path = os.path.join(src_dir, item)
         dst_path = os.path.join(dst_dir, item)
         if os.path.isdir(src_path):
-            shutil.copytree(src_path, dst_path)
+            shutil.copytree(src_path, dst_path, dirs_exist_ok=True)
         else:
             shutil.copy2(src_path, dst_path)


-def check_for_updates(repo_name: str, repo_owner: str, app_name: str, current_version: str) -> Optional[Dict[str, Any]]:
+def check_for_updates(repo_name, repo_owner, app_name, current_version):
     def get_github_releases(owner, repo):
         import requests
         url = f"https://api.github.com/repos/{owner}/{repo}/releases"
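For context on the `copytree` change above: without `dirs_exist_ok=True`, copying into a destination that already exists raises `FileExistsError`; with it (Python 3.8+), the copy merges into the existing tree. A minimal sketch using throwaway temp directories (the file name and contents are illustrative only):

```python
import shutil
import tempfile
from pathlib import Path

src = Path(tempfile.mkdtemp(prefix='zordon_src_'))
dst = Path(tempfile.mkdtemp(prefix='zordon_dst_')) / 'config'
(src / 'example.yml').write_text('upload_folder: ~/zordon_uploads\n')

shutil.copytree(src, dst)                      # first copy creates dst
try:
    shutil.copytree(src, dst)                  # plain call: fails once dst exists
except FileExistsError:
    pass
shutil.copytree(src, dst, dirs_exist_ok=True)  # merges into the existing dst
```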
@@ -210,15 +223,33 @@ def check_for_updates(repo_name: str, repo_owner: str, app_name: str, current_ve
            return latest_version
    return None

-def is_localhost(comparison_hostname: str) -> bool:
-    return comparison_hostname in ['localhost', '127.0.0.1', socket.gethostname()]
+def is_localhost(comparison_hostname):
+    # this is necessary because socket.gethostname() does not always include '.local' - This is a sanitized comparison
+    try:
+        comparison_hostname = comparison_hostname.lower().replace('.local', '')
+        local_hostname = socket.gethostname().lower().replace('.local', '')
+        return comparison_hostname == local_hostname
+    except AttributeError:
+        return False


-def num_to_alphanumeric(num: int) -> str:
-    return string.ascii_letters[num % 26] + str(num // 26)
+def num_to_alphanumeric(num):
+    # List of possible alphanumeric characters
+    characters = string.ascii_letters + string.digits
+
+    # Make sure number is positive
+    num = abs(num)
+
+    # Convert number to alphanumeric
+    result = ""
+    while num > 0:
+        num, remainder = divmod(num, len(characters))
+        result += characters[remainder]
+
+    return result[::-1]  # Reverse the result to get the correct alphanumeric string


-def get_gpu_info() -> List[Dict[str, Any]]:
+def get_gpu_info():
     """Cross-platform GPU information retrieval"""

     def get_windows_gpu_info():
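Side note on the `is_localhost()` rewrite above: Zeroconf typically advertises hosts with a `.local` suffix while `socket.gethostname()` may not include it, so a plain membership test can miss the local machine. A minimal sketch of the sanitized comparison (hostnames here are made up; `local_hostname` stands in for `socket.gethostname()`):

```python
def is_same_host(advertised: str, local_hostname: str) -> bool:
    # Strip the mDNS '.local' suffix and compare case-insensitively,
    # mirroring the new is_localhost() body shown above.
    return (advertised.lower().replace('.local', '')
            == local_hostname.lower().replace('.local', ''))

assert is_same_host('RenderBox.local', 'renderbox')      # matched after sanitizing
assert not is_same_host('OtherBox.local', 'renderbox')   # different machine
```

The same hunk also replaces the old `num_to_alphanumeric()` with a base-62 encoding over `string.ascii_letters + string.digits`.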