6 Commits

Author SHA1 Message Date
Brett Williams
a3e2fa7e07 Update the renderer to reflect the current file type 2023-11-17 08:18:07 -06:00
Brett Williams
23901bc8e4 Fix add_job crashing 2023-11-16 14:36:09 -06:00
Brett Williams
ba996c58f5 Make sure supported_extensions is now called as a method everywhere 2023-11-16 14:01:48 -06:00
Brett Williams
9e8eb77328 Cleanup extension matching 2023-11-16 13:55:49 -06:00
Brett Williams
81d2cb70b8 Cleanup unnecessary code in FFMPEG 2023-11-16 11:27:30 -06:00
Brett Williams
6dc8db2d8c Make sure progress UI updates occur on main thread 2023-11-16 06:13:12 -06:00
92 changed files with 2748 additions and 6010 deletions

11
.flake8

@@ -1,11 +0,0 @@
[flake8]
exclude =
src/engines/aerender
.git
build
dist
*.egg
venv
.venv
max-complexity = 10
max-line-length = 127


@@ -1,38 +0,0 @@
name: Create Executables
on:
workflow_dispatch:
release:
- types: [created]
jobs:
pyinstaller-build-windows:
runs-on: windows-latest
steps:
- name: Create Executables (Windows)
uses: sayyid5416/pyinstaller@v1
with:
python_ver: '3.11'
spec: 'client.spec'
requirements: 'requirements.txt'
upload_exe_with_name: 'Zordon'
pyinstaller-build-linux:
runs-on: ubuntu-latest
steps:
- name: Create Executables (Linux)
uses: sayyid5416/pyinstaller@v1
with:
python_ver: '3.11'
spec: 'client.spec'
requirements: 'requirements.txt'
upload_exe_with_name: 'Zordon'
pyinstaller-build-macos:
runs-on: macos-latest
steps:
- name: Create Executables (macOS)
uses: sayyid5416/pyinstaller@v1
with:
python_ver: '3.11'
spec: 'client.spec'
requirements: 'requirements.txt'
upload_exe_with_name: 'Zordon'

23
.github/workflows/pylint.yml vendored Normal file

@@ -0,0 +1,23 @@
name: Pylint
on: [push]
jobs:
build:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.8", "3.9", "3.10"]
steps:
- uses: actions/checkout@v3
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v3
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install pylint
- name: Analysing the code with pylint
run: |
pylint $(git ls-files '*.py')


@@ -34,7 +34,6 @@ jobs:
flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
# exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
continue-on-error: false
# - name: Test with pytest
# run: |
# pytest
- name: Test with pytest
run: |
pytest

14
.gitignore vendored

@@ -1,16 +1,8 @@
/job_history.json
*.icloud
*.fcpxml
/uploads
*.pyc
/server_state.json
/.scheduler_prefs
*.db
/dist/
/build/
/.github/
*.idea
.DS_Store
/venv/
.env
venv/
/.eggs/
/.ai/
/.github/


@@ -1,5 +0,0 @@
[MASTER]
max-line-length = 120
ignore-paths=^src/engines/aerender/
[MESSAGES CONTROL]
disable = missing-docstring, invalid-name, import-error, logging-fstring-interpolation


@@ -1,21 +0,0 @@
MIT License
Copyright (c) 2024 Brett Williams
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

197
README.md

@@ -1,196 +1,23 @@
![Zordon Screenshot](docs/screenshot.png)
# 🎬 Zordon - Render Management Tools 🎬
---
Welcome to Zordon! This is a hobby project written with fellow filmmakers in mind. It's a local network render farm manager, aiming to streamline and simplify the rendering process across multiple home computers.
# Zordon
## 📦 Installation
A Python-based distributed rendering management tool that supports Blender, FFmpeg, and other render engines. Zordon efficiently manages render jobs across multiple machines, making it ideal for small render farms in home studios or small businesses.
Make sure to install the necessary dependencies: `pip3 install -r requirements.txt`
## What is Zordon?
## 🚀 How to Use
Zordon is a tool designed for small render farms, such as those used in home studios or small businesses, to efficiently manage and run render jobs for Blender, FFmpeg, and other video renderers. It simplifies the process of distributing rendering tasks across multiple available machines, optimizing the rendering workflow for artists, animators, and video professionals.
Zordon has two main files: `start_server.py` and `start_client.py`.
The system works by:
- **Server**: Central coordinator that manages job queues and distributes tasks to available workers
- **Clients**: Lightweight workers that run on rendering machines and execute assigned jobs
- **API**: RESTful endpoints for programmatic job submission and monitoring
- **start_server.py**: Run this on any computer you want to render jobs. It manages the incoming job queue and kicks off the appropriate render jobs when ready.
- **start_client.py**: Run this to administer your render servers. It lets you manage and submit jobs.
## Features
When the server is running, the job queue can be accessed via a web browser on the server's hostname (default port is 8080). You can also access it via the GUI client or a simple view-only dashboard.
- **Distributed Rendering**: Queue and distribute render jobs across multiple machines
- **Multi-Engine Support**: Compatible with Blender, FFmpeg, and extensible to other render engines
- **Desktop UI**: PyQt6 interface for job management and monitoring
- **REST API**: Flask-based API for programmatic access
- **Cross-Platform**: Runs on Windows, macOS, and Linux
- **Zero-Install Clients**: Lightweight client executables for worker machines
## 🎨 Supported Renderers
## Installation
### Prerequisites
- Python 3.11 or later
- Git
### Setup
1. Clone the repository:
```bash
git clone https://github.com/blw1138/Zordon.git
cd Zordon
```
2. Install dependencies:
```bash
pip install -r requirements.txt
```
3. (Optional) Install PyInstaller for building executables:
```bash
pip install pyinstaller pyinstaller_versionfile
```
## Usage
### Quick Start
1. **Start the Server**: Run the central server to coordinate jobs.
```bash
python server.py
```
2. **Launch Clients**: On each rendering machine, run the client to connect to the server.
```bash
python client.py
```
### Detailed Workflow
#### Setting Up a Render Farm
1. Choose one machine as the server (preferably a dedicated machine with good network connectivity).
2. Build and distribute client executables to worker machines:
```bash
pyinstaller client.spec
```
Copy the generated executable to each worker machine.
3. Ensure all machines can communicate via network (same subnet recommended).
#### Submitting Render Jobs
Jobs can be submitted via the desktop UI or programmatically via the API:
- **Via UI**: Use the desktop interface to upload project files, specify render settings, and queue jobs.
- **Via API**: Send POST requests to `/api/jobs` with job configuration in JSON format.
Example API request:
```bash
curl -X POST http://localhost:5000/api/jobs \
-H "Content-Type: application/json" \
-d '{
"engine": "blender",
"project_path": "/path/to/project.blend",
"output_path": "/path/to/output",
"frames": "1-100",
"settings": {"resolution": "1920x1080"}
}'
```
#### Monitoring and Managing Jobs
- **UI**: View job status, progress, logs, and worker availability in real-time.
- **API Endpoints**:
- `GET /api/jobs`: List all jobs
- `GET /api/jobs/{id}`: Get job details
- `DELETE /api/jobs/{id}`: Cancel a job
- `GET /api/workers`: List connected workers
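The endpoints above suggest a simple status-polling loop for scripted clients. The sketch below is illustrative, not Zordon's actual client code; the JSON field names `status` and `percent_complete` are assumptions drawn from this repository's CLI script, and the fetch callable stands in for a `requests.get(...).json()` call.

```python
# Hypothetical polling helper for the job-status endpoint described above.
# The "status" and "percent_complete" field names are assumptions.
import time
from typing import Callable


def wait_for_job(fetch_job: Callable[[], dict], poll_seconds: float = 1.0) -> dict:
    """Poll a job-info callable until the job reports completion or failure."""
    while True:
        # e.g. fetch_job = lambda: requests.get(f"{server}/api/jobs/{job_id}").json()
        job = fetch_job()
        if job.get("status") in ("finished", "failed") or job.get("percent_complete", 0) >= 1.0:
            return job
        time.sleep(poll_seconds)
```

In real use you would also bound the loop with a timeout, as the CLI script's own `# todo: add timeout` comment notes.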
#### Worker Management
Workers automatically connect to the server when started. You can:
- View worker status and capabilities in the dashboard
- Configure worker priorities and resource limits
- Monitor CPU/GPU usage per worker
### Development Mode
For development and testing:
Run the server:
```bash
python server.py
```
Run a client (can run multiple for testing):
```bash
python client.py
```
### Building Executables
Build server executable:
```bash
pyinstaller server.spec
```
Build client executable:
```bash
pyinstaller client.spec
```
## Configuration
Settings are stored in `src/utilities/config.py`. Supports YAML/JSON for data serialization and environment-specific configurations.
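For orientation, a minimal `config.yaml` might look like the sketch below. The keys are taken from this repository's own config diff; the values shown are illustrative defaults, not recommendations.

```yaml
# Illustrative config.yaml sketch; keys appear elsewhere in this changeset,
# values here are examples only.
update_engines_on_launch: true
server_log_level: info
log_buffer_length: 250
subjob_connection_timeout: 120
flask_log_level: error
flask_debug_enable: false
queue_eval_seconds: 1
```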
## Architecture
Zordon follows a modular architecture with the following key components:
- **API Server** (`src/api/`): Flask-based REST API
- **Engine System** (`src/engines/`): Pluggable render engines (Blender, FFmpeg, etc.)
- **UI** (`src/ui/`): PyQt6-based interface
- **Job Management** (`src/render_queue.py`): Distributed job queue
Design patterns include Factory Pattern for engine creation, Observer Pattern for status updates, and Strategy Pattern for different worker implementations.
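The Factory Pattern for engine creation can be sketched as follows. This is a minimal illustration of the pattern, not Zordon's actual `EngineManager` API; all class and function names here are hypothetical.

```python
# Hypothetical sketch of engine creation via the Factory Pattern.
# Names (RenderEngine, create_engine, ...) are illustrative only.
from abc import ABC, abstractmethod


class RenderEngine(ABC):
    @abstractmethod
    def render(self, project_path: str) -> str:
        """Run a render and return a status message."""


class BlenderEngine(RenderEngine):
    def render(self, project_path: str) -> str:
        return f"blender rendering {project_path}"


class FFmpegEngine(RenderEngine):
    def render(self, project_path: str) -> str:
        return f"ffmpeg transcoding {project_path}"


_ENGINES = {"blender": BlenderEngine, "ffmpeg": FFmpegEngine}


def create_engine(name: str) -> RenderEngine:
    """Look up and instantiate an engine by name."""
    try:
        return _ENGINES[name.lower()]()
    except KeyError:
        raise ValueError(f"Unknown engine: {name}") from None
```

New engines register by adding an entry to the mapping, which keeps job-submission code decoupled from concrete engine classes.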
## Contributing
1. Fork the repository
2. Create a feature branch: `git checkout -b feature/your-feature`
3. Follow the code style guidelines in `AGENTS.md`
4. Test the build: `pyinstaller server.spec`
5. Submit a pull request
### Commit Message Format
```
feat: add support for new render engine
fix: resolve crash when engine path is invalid
docs: update API documentation
refactor: simplify job status handling
```
## Supported Render Engines
Zordon currently supports the following renderers:
- **Blender**
- **FFmpeg**
- **Adobe After Effects** (planned)
- **Cinema 4D** (planned)
- **Autodesk Maya** (planned)
## System Requirements
- Windows 10 or later
- macOS Ventura (13.0) or later
- Linux (supported versions TBD)
## License
Zordon is licensed under the MIT License. See the [LICENSE](LICENSE.txt) file for more details.
## Notice
This software is in beta and intended for casual/hobbyist use. Not recommended for mission-critical environments.
- **FFMPEG**


@@ -1,131 +0,0 @@
#!/usr/bin/env python3
import argparse
import logging
import os
import socket
import sys
import time
from server import ZordonServer
from src.api.serverproxy_manager import ServerProxyManager
logger = logging.getLogger()
def main():
parser = argparse.ArgumentParser(
description="Zordon CLI tool for preparing/submitting a render job",
formatter_class=argparse.ArgumentDefaultsHelpFormatter
)
# Required arguments
parser.add_argument("scene_file", help="Path to the scene file (e.g., .blend, .max, .mp4)")
parser.add_argument("engine", help="Desired render engine", choices=['blender', 'ffmpeg'])
# Frame range
parser.add_argument("--start", type=int, default=1, help="Start frame")
parser.add_argument("--end", type=int, default=1, help="End frame")
# Job metadata
parser.add_argument("--name", default=None, help="Job name")
# Output
parser.add_argument("--output", default="", help="Output path/pattern (e.g., /renders/frame_####.exr)")
# Target OS and Engine Version
parser.add_argument(
"--os",
choices=["any", "windows", "linux", "macos"],
default="any",
help="Target operating system for render workers"
)
parser.add_argument(
"--engine-version",
default="latest",
help="Required renderer/engine version number (e.g., '4.2', '5.0')"
)
# Optional flags
parser.add_argument("--dry-run", action="store_true", help="Print job details without submitting")
args = parser.parse_args()
# Basic validation
if not os.path.exists(args.scene_file):
print(f"Error: Scene file '{args.scene_file}' not found!", file=sys.stderr)
sys.exit(1)
if args.start > args.end:
print("Error: Start frame cannot be greater than end frame!", file=sys.stderr)
sys.exit(1)
# Calculate total frames
total_frames = len(range(args.start, args.end + 1))
job_name = args.name or os.path.basename(args.scene_file)
file_path = os.path.abspath(args.scene_file)
# Print job summary
print("Render Job Summary:")
print(f" Job Name : {job_name}")
print(f" Scene File : {file_path}")
print(f" Engine : {args.engine}")
print(f" Frames : {args.start}-{args.end} ({total_frames} frames)")
print(f" Output Path : {args.output or '(default from scene)'}")
print(f" Target OS : {args.os}")
print(f" Engine Version : {args.engine_version}")
if args.dry_run:
print("\nDry run complete (no submission performed).")
return
local_hostname = socket.gethostname()
local_hostname = local_hostname + (".local" if not local_hostname.endswith(".local") else "")
found_proxy = ServerProxyManager.get_proxy_for_hostname(local_hostname)
is_connected = found_proxy.check_connection()
adhoc_server = None
if not is_connected:
adhoc_server = ZordonServer()
adhoc_server.start_server()
found_proxy = ServerProxyManager.get_proxy_for_hostname(adhoc_server.server_hostname)
while not is_connected:
# todo: add timeout
is_connected = found_proxy.check_connection()
time.sleep(1)
new_job = {"name": job_name, "engine_name": args.engine}
try:
response = found_proxy.post_job_to_server(file_path, new_job)
except Exception as e:
print(f"Error creating job: {e}")
sys.exit(1)
if response and response.ok:
print(f"Uploaded to {found_proxy.hostname} successfully!")
running_job_data = response.json()
job_id = running_job_data.get('id')
print(f"Job {job_id} Summary:")
print(f" Status : {running_job_data.get('status')}")
print(f" Engine : {running_job_data.get('engine_name')}-{running_job_data.get('engine_version')}")
print("\nWaiting for render to complete...")
percent_complete = 0.0
while percent_complete < 1.0:
# add checks for errors
time.sleep(1)
running_job_data = found_proxy.get_job_info(job_id)
percent_complete = running_job_data['percent_complete']
sys.stdout.write("\x1b[1A") # Move up 1
sys.stdout.write("\x1b[0J") # Clear from cursor to end of screen (optional)
print(f"Percent Complete: {percent_complete:.2%}")
sys.stdout.flush()
print("Finished rendering successfully!")
else:
print(f"Failed to upload job: {response.text}", file=sys.stderr)
if adhoc_server:
adhoc_server.stop_server()
if __name__ == "__main__":
main()


@@ -1,77 +0,0 @@
#!/usr/bin/env python3
import logging
import sys
import threading
from collections import deque
from server import ZordonServer
logger = logging.getLogger()
def __setup_buffer_handler():
# lazy load GUI frameworks
from PyQt6.QtCore import QObject, pyqtSignal
class BufferingHandler(logging.Handler, QObject):
new_record = pyqtSignal(str)
flushOnClose = True
def __init__(self, capacity=100):
logging.Handler.__init__(self)
QObject.__init__(self)
self.buffer = deque(maxlen=capacity) # Define a buffer with a fixed capacity
def emit(self, record):
try:
msg = self.format(record)
self.buffer.append(msg) # Add message to the buffer
self.new_record.emit(msg) # Emit signal
except RuntimeError:
pass
def get_buffer(self):
return list(self.buffer) # Return a copy of the buffer
buffer_handler = BufferingHandler()
buffer_handler.setFormatter(logging.getLogger().handlers[0].formatter)
new_logger = logging.getLogger()
new_logger.addHandler(buffer_handler)
return buffer_handler
def __show_gui(buffer_handler):
# lazy load GUI frameworks
from PyQt6.QtWidgets import QApplication
# load application
app: QApplication = QApplication(sys.argv)
if app.style().objectName() != 'macos':
app.setStyle('Fusion')
# configure main window
from src.ui.main_window import MainWindow
window: MainWindow = MainWindow()
window.buffer_handler = buffer_handler
window.show()
exit_code = app.exec()
# cleanup: remove and close the GUI logging handler before interpreter shutdown
root_logger = logging.getLogger()
if buffer_handler in root_logger.handlers:
root_logger.removeHandler(buffer_handler)
try:
buffer_handler.close()
except Exception:
# never let logging cleanup throw during shutdown
pass
return exit_code
if __name__ == '__main__':
import sys
server = ZordonServer()
server.start_server()
exit_code = __show_gui(__setup_buffer_handler())
server.stop_server()
sys.exit(exit_code)


@@ -1,158 +0,0 @@
# -*- mode: python ; coding: utf-8 -*-
from PyInstaller.utils.hooks import collect_all
from pathlib import Path
import os
import sys
import platform
# ------------------------------------------------------------
# Project paths
# ------------------------------------------------------------
project_root = Path(SPECPATH).resolve()
src_dir = project_root / "src"
# Ensure `src.*` imports work during analysis
sys.path.insert(0, str(project_root))
# ------------------------------------------------------------
# Import version info
# ------------------------------------------------------------
from src.version import APP_NAME, APP_VERSION, APP_AUTHOR
APP_NAME = f"{APP_NAME}-client"
# ------------------------------------------------------------
# PyInstaller data / imports
# ------------------------------------------------------------
datas = [
("resources", "resources"),
("src/engines/blender/scripts", "src/engines/blender/scripts"),
]
binaries = []
hiddenimports = ["zeroconf", "src.version"]
tmp_ret = collect_all("zeroconf")
datas += tmp_ret[0]
binaries += tmp_ret[1]
hiddenimports += tmp_ret[2]
# ------------------------------------------------------------
# Analysis
# ------------------------------------------------------------
a = Analysis(
["client.py"],
pathex=[str(project_root)],
binaries=binaries,
datas=datas,
hiddenimports=hiddenimports,
hookspath=[],
hooksconfig={},
runtime_hooks=[],
excludes=[],
noarchive=False,
optimize=1, # optimize=2 breaks Windows builds
)
pyz = PYZ(a.pure)
# ------------------------------------------------------------
# Platform targets
# ------------------------------------------------------------
if platform.system() == "Darwin": # macOS
exe = EXE(
pyz,
a.scripts,
[],
exclude_binaries=True,
name=APP_NAME,
debug=False,
bootloader_ignore_signals=False,
strip=True,
upx=True,
console=False,
disable_windowed_traceback=False,
argv_emulation=False,
target_arch=None,
codesign_identity=None,
entitlements_file=None,
)
app = BUNDLE(
exe,
a.binaries,
a.datas,
strip=True,
name=f"{APP_NAME}.app",
icon="resources/Server.png",
bundle_identifier=None,
version=APP_VERSION,
)
elif platform.system() == "Windows":
import pyinstaller_versionfile
import tempfile
version_file_path = os.path.join(
tempfile.gettempdir(), "versionfile.txt"
)
pyinstaller_versionfile.create_versionfile(
output_file=version_file_path,
version=APP_VERSION,
company_name=APP_AUTHOR,
file_description=APP_NAME,
internal_name=APP_NAME,
legal_copyright=f"© {APP_AUTHOR}",
original_filename=f"{APP_NAME}.exe",
product_name=APP_NAME,
)
exe = EXE(
pyz,
a.scripts,
a.binaries,
a.datas,
[],
name=APP_NAME,
debug=False,
bootloader_ignore_signals=False,
strip=True,
upx=True,
console=False,
disable_windowed_traceback=False,
argv_emulation=False,
target_arch=None,
codesign_identity=None,
entitlements_file=None,
version=version_file_path,
)
else: # Linux
exe = EXE(
pyz,
a.scripts,
a.binaries,
a.datas,
[],
name=APP_NAME,
debug=False,
bootloader_ignore_signals=False,
strip=True,
upx=True,
console=False,
disable_windowed_traceback=False,
argv_emulation=False,
target_arch=None,
codesign_identity=None,
entitlements_file=None,
)


@@ -3,7 +3,7 @@ update_engines_on_launch: true
max_content_path: 100000000
server_log_level: info
log_buffer_length: 250
worker_process_timeout: 120
subjob_connection_timeout: 120
flask_log_level: error
flask_debug_enable: false
queue_eval_seconds: 1

Binary file not shown (image removed; was 838 KiB).

6
main.py Executable file

@@ -0,0 +1,6 @@
#!/usr/bin/env python3
from src import init
if __name__ == '__main__':
import sys
sys.exit(init.run())


@@ -1,2 +0,0 @@
[pytest]
norecursedirs = src/engines/aerender .git build dist *.egg venv .venv env .env __pycache__ .pytest_cache


@@ -1,20 +1,15 @@
PyQt6>=6.7.0
psutil>=5.9.8
requests>=2.32.2
Pillow>=10.3.0
PyYAML>=6.0.1
flask>=3.0.3
tqdm>=4.66.4
werkzeug>=3.0.3
Pypubsub>=4.0.3
zeroconf>=0.132.2
SQLAlchemy>=2.0.30
plyer>=2.1.0
rich>=13.7.1
setuptools>=70.0.0
py-cpuinfo>=9.0.0
requests-toolbelt>=1.0.0
PyQt6-sip>=13.6.0
humanize>=4.12.1
macholib>=1.16.3
altgraph>=0.17.4
requests==2.31.0
psutil==5.9.6
PyYAML==6.0.1
Flask==3.0.0
rich==13.6.0
Werkzeug~=3.0.1
json2html~=1.3.0
SQLAlchemy~=2.0.15
Pillow==10.1.0
zeroconf==0.119.0
Pypubsub~=4.0.3
tqdm==4.66.1
plyer==2.1.0
PyQt6~=6.6.0
PySide6~=6.6.0

126
server.py

@@ -1,125 +1,5 @@
import logging
import multiprocessing
import os
import socket
import threading
from pathlib import Path
import psutil
from src.api.api_server import API_VERSION
from src.api.api_server import start_api_server
from src.api.preview_manager import PreviewManager
from src.api.serverproxy_manager import ServerProxyManager
from src.distributed_job_manager import DistributedJobManager
from src.engines.engine_manager import EngineManager
from src.render_queue import RenderQueue
from src.utilities.config import Config
from src.utilities.misc_helper import (get_gpu_info, current_system_cpu, current_system_os,
current_system_os_version, current_system_cpu_brand)
from src.utilities.zeroconf_server import ZeroconfServer
from src.version import APP_NAME, APP_VERSION
logger = logging.getLogger()
class ZordonServer:
def __init__(self):
# setup logging
logging.basicConfig(format='%(asctime)s: %(levelname)s: %(module)s: %(message)s', datefmt='%d-%b-%y %H:%M:%S',
level=Config.server_log_level.upper())
logging.getLogger("requests").setLevel(logging.WARNING) # suppress noisy requests/urllib3 logging
logging.getLogger("urllib3").setLevel(logging.WARNING)
# Load Config YAML
Config.setup_config_dir()
config_path = Path(Config.config_dir()) / "config.yaml"
Config.load_config(config_path)
# configure default paths
EngineManager.engines_path = str(Path(Config.upload_folder).expanduser()/ "engines")
os.makedirs(EngineManager.engines_path, exist_ok=True)
PreviewManager.storage_path = Path(Config.upload_folder).expanduser() / "previews"
self.api_server = None
self.server_hostname: str = socket.gethostname()
def start_server(self):
def existing_process(process_name):
current_pid = os.getpid()
current_process = psutil.Process(current_pid)
for proc in psutil.process_iter(['pid', 'name', 'ppid']):
proc_name = proc.info['name'].lower().rstrip('.exe')
if proc_name == process_name.lower() and proc.info['pid'] != current_pid:
if proc.info['pid'] == current_process.ppid():
continue # parent process
elif proc.info['ppid'] == current_pid:
continue # child process
else:
return proc # unrelated process
return None
# check for existing instance
existing_proc = existing_process(APP_NAME)
if existing_proc:
err_msg = f"Another instance of {APP_NAME} is already running (pid: {existing_proc.pid})"
logger.fatal(err_msg)
raise ProcessLookupError(err_msg)
# main start
logger.info(f"Starting {APP_NAME} Render Server ({APP_VERSION})")
logger.debug(f"Upload directory: {Path(Config.upload_folder).expanduser()}")
logger.debug(f"Thumbs directory: {PreviewManager.storage_path}")
logger.debug(f"Engines directory: {EngineManager.engines_path}")
# Set up the RenderQueue object
RenderQueue.load_state(database_directory=Path(Config.upload_folder).expanduser())
ServerProxyManager.subscribe_to_listener()
DistributedJobManager.subscribe_to_listener()
# update hostname
self.server_hostname = socket.gethostname()
# configure and start API server
self.api_server = threading.Thread(target=start_api_server, args=(self.server_hostname,))
self.api_server.daemon = True
self.api_server.start()
# start zeroconf server
ZeroconfServer.configure(f"_{APP_NAME.lower()}._tcp.local.", self.server_hostname, Config.port_number)
ZeroconfServer.properties = {'system_cpu': current_system_cpu(),
'system_cpu_brand': current_system_cpu_brand(),
'system_cpu_cores': multiprocessing.cpu_count(),
'system_os': current_system_os(),
'system_os_version': current_system_os_version(),
'system_memory': round(psutil.virtual_memory().total / (1024**3)), # in GB
'gpu_info': get_gpu_info(),
'api_version': API_VERSION}
ZeroconfServer.start()
logger.info(f"{APP_NAME} Render Server started - Hostname: {self.server_hostname}")
RenderQueue.start() # Start evaluating the render queue
def is_running(self):
return self.api_server and self.api_server.is_alive()
def stop_server(self):
logger.info(f"{APP_NAME} Render Server is preparing to stop")
try:
ZeroconfServer.stop()
RenderQueue.prepare_for_shutdown()
except Exception as e:
logger.exception(f"Exception during prepare for shutdown: {e}")
logger.info(f"{APP_NAME} Render Server has shut down")
#!/usr/bin/env python3
from src.api.api_server import start_server
if __name__ == '__main__':
server = ZordonServer()
try:
server.start_server()
server.api_server.join()
except KeyboardInterrupt:
pass
except Exception as e:
logger.fatal(f"Unhandled exception: {e}")
finally:
server.stop_server()
start_server()


@@ -1,158 +0,0 @@
# -*- mode: python ; coding: utf-8 -*-
from PyInstaller.utils.hooks import collect_all
from pathlib import Path
import os
import sys
import platform
# ------------------------------------------------------------
# Project paths
# ------------------------------------------------------------
project_root = Path(SPECPATH).resolve()
src_dir = project_root / "src"
# Ensure `src.*` imports work during analysis
sys.path.insert(0, str(project_root))
# ------------------------------------------------------------
# Import version info
# ------------------------------------------------------------
from src.version import APP_NAME, APP_VERSION, APP_AUTHOR
APP_NAME = f"{APP_NAME}-server"
# ------------------------------------------------------------
# PyInstaller data / imports
# ------------------------------------------------------------
datas = [
("resources", "resources"),
("src/engines/blender/scripts", "src/engines/blender/scripts"),
]
binaries = []
hiddenimports = ["zeroconf", "src.version"]
tmp_ret = collect_all("zeroconf")
datas += tmp_ret[0]
binaries += tmp_ret[1]
hiddenimports += tmp_ret[2]
# ------------------------------------------------------------
# Analysis
# ------------------------------------------------------------
a = Analysis(
["server.py"],
pathex=[str(project_root)],
binaries=binaries,
datas=datas,
hiddenimports=hiddenimports,
hookspath=[],
hooksconfig={},
runtime_hooks=[],
excludes=[],
noarchive=False,
optimize=1, # optimize=2 breaks Windows builds
)
pyz = PYZ(a.pure)
# ------------------------------------------------------------
# Platform targets
# ------------------------------------------------------------
if platform.system() == "Darwin": # macOS
exe = EXE(
pyz,
a.scripts,
[],
exclude_binaries=True,
name=APP_NAME,
debug=False,
bootloader_ignore_signals=False,
strip=True,
upx=True,
console=False,
disable_windowed_traceback=False,
argv_emulation=False,
target_arch=None,
codesign_identity=None,
entitlements_file=None,
)
app = BUNDLE(
exe,
a.binaries,
a.datas,
strip=True,
name=f"{APP_NAME}.app",
icon="resources/Server.png",
bundle_identifier=None,
version=APP_VERSION,
)
elif platform.system() == "Windows":
import pyinstaller_versionfile
import tempfile
version_file_path = os.path.join(
tempfile.gettempdir(), "versionfile.txt"
)
pyinstaller_versionfile.create_versionfile(
output_file=version_file_path,
version=APP_VERSION,
company_name=APP_AUTHOR,
file_description=APP_NAME,
internal_name=APP_NAME,
legal_copyright=f"© {APP_AUTHOR}",
original_filename=f"{APP_NAME}.exe",
product_name=APP_NAME,
)
exe = EXE(
pyz,
a.scripts,
a.binaries,
a.datas,
[],
name=APP_NAME,
debug=False,
bootloader_ignore_signals=False,
strip=True,
upx=True,
console=False,
disable_windowed_traceback=False,
argv_emulation=False,
target_arch=None,
codesign_identity=None,
entitlements_file=None,
version=version_file_path,
)
else: # Linux
exe = EXE(
pyz,
a.scripts,
a.binaries,
a.datas,
[],
name=APP_NAME,
debug=False,
bootloader_ignore_signals=False,
strip=True,
upx=True,
console=False,
disable_windowed_traceback=False,
argv_emulation=False,
target_arch=None,
codesign_identity=None,
entitlements_file=None,
)

179
src/api/add_job_helpers.py Normal file

@@ -0,0 +1,179 @@
#!/usr/bin/env python3
import logging
import os
import shutil
import tempfile
import zipfile
from datetime import datetime
import requests
from tqdm import tqdm
from werkzeug.utils import secure_filename
from src.distributed_job_manager import DistributedJobManager
from src.engines.engine_manager import EngineManager
from src.render_queue import RenderQueue
logger = logging.getLogger()
def handle_uploaded_project_files(request, jobs_list, upload_directory):
# Initialize default values
loaded_project_local_path = None
uploaded_project = request.files.get('file', None)
project_url = jobs_list[0].get('url', None)
local_path = jobs_list[0].get('local_path', None)
renderer = jobs_list[0].get('renderer')
downloaded_file_url = None
if uploaded_project and uploaded_project.filename:
referred_name = os.path.basename(uploaded_project.filename)
elif project_url:
referred_name, downloaded_file_url = download_project_from_url(project_url)
if not referred_name:
raise ValueError(f"Error downloading file from URL: {project_url}")
elif local_path and os.path.exists(local_path):
referred_name = os.path.basename(local_path)
else:
raise ValueError("Cannot find any valid project paths")
# Prepare the local filepath
cleaned_path_name = os.path.splitext(referred_name)[0].replace(' ', '_')
job_dir = os.path.join(upload_directory, '-'.join(
[datetime.now().strftime("%Y.%m.%d_%H.%M.%S"), renderer, cleaned_path_name]))
os.makedirs(job_dir, exist_ok=True)
project_source_dir = os.path.join(job_dir, 'source')
os.makedirs(project_source_dir, exist_ok=True)
# Move projects to their work directories
if uploaded_project and uploaded_project.filename:
loaded_project_local_path = os.path.join(project_source_dir, secure_filename(uploaded_project.filename))
uploaded_project.save(loaded_project_local_path)
logger.info(f"Transfer complete for {loaded_project_local_path.split(upload_directory)[-1]}")
elif project_url:
loaded_project_local_path = os.path.join(project_source_dir, referred_name)
shutil.move(downloaded_file_url, loaded_project_local_path)
logger.info(f"Download complete for {loaded_project_local_path.split(upload_directory)[-1]}")
elif local_path:
loaded_project_local_path = os.path.join(project_source_dir, referred_name)
shutil.copy(local_path, loaded_project_local_path)
logger.info(f"Import complete for {loaded_project_local_path.split(upload_directory)[-1]}")
return loaded_project_local_path, referred_name
def download_project_from_url(project_url):
# Download a project file from a URL into a temp location
logger.info(f"Downloading project from url: {project_url}")
referred_name = os.path.basename(project_url)
downloaded_file_url = None
try:
response = requests.get(project_url, stream=True)
if response.status_code == 200:
# Get the total file size from the "Content-Length" header
file_size = int(response.headers.get("Content-Length", 0))
# Create a progress bar using tqdm
progress_bar = tqdm(total=file_size, unit="B", unit_scale=True)
# Open a file for writing in binary mode
downloaded_file_url = os.path.join(tempfile.gettempdir(), referred_name)
with open(downloaded_file_url, "wb") as file:
for chunk in response.iter_content(chunk_size=1024):
if chunk:
# Write the chunk to the file
file.write(chunk)
# Update the progress bar
progress_bar.update(len(chunk))
# Close the progress bar
progress_bar.close()
return referred_name, downloaded_file_url
except Exception as e:
logger.error(f"Error downloading file: {e}")
return None, None
def process_zipped_project(zip_path):
# Given a zip path, extract its content, and return the main project file path
work_path = os.path.dirname(zip_path)
try:
with zipfile.ZipFile(zip_path, 'r') as myzip:
myzip.extractall(work_path)
project_files = [x for x in os.listdir(work_path) if os.path.isfile(os.path.join(work_path, x))]
project_files = [x for x in project_files if '.zip' not in x]
logger.debug(f"Zip files: {project_files}")
# supported_exts = RenderWorkerFactory.class_for_name(renderer).engine.supported_extensions
# if supported_exts:
# project_files = [file for file in project_files if any(file.endswith(ext) for ext in supported_exts)]
# If there's more than 1 project file or none, raise an error
if len(project_files) != 1:
raise ValueError(f'Cannot find a valid project file in {os.path.basename(zip_path)}')
extracted_project_path = os.path.join(work_path, project_files[0])
logger.info(f"Extracted zip file to {extracted_project_path}")
except (zipfile.BadZipFile, zipfile.LargeZipFile) as e:
logger.error(f"Error processing zip file: {e}")
raise ValueError(f"Error processing zip file: {e}")
return extracted_project_path
def create_render_jobs(jobs_list, loaded_project_local_path, job_dir):
results = []
for job_data in jobs_list:
try:
# get new output path in output_dir
output_path = job_data.get('output_path')
if not output_path:
loaded_project_filename = os.path.basename(loaded_project_local_path)
output_filename = os.path.splitext(loaded_project_filename)[0]
else:
output_filename = os.path.basename(output_path)
# Prepare output path
output_dir = os.path.join(os.path.dirname(os.path.dirname(loaded_project_local_path)), 'output')
output_path = os.path.join(output_dir, output_filename)
os.makedirs(output_dir, exist_ok=True)
logger.debug(f"New job output path: {output_path}")
# create & configure jobs
worker = EngineManager.create_worker(renderer=job_data['renderer'],
input_path=loaded_project_local_path,
output_path=output_path,
engine_version=job_data.get('engine_version'),
args=job_data.get('args', {}))
worker.status = job_data.get("initial_status", worker.status)
worker.parent = job_data.get("parent", worker.parent)
worker.name = job_data.get("name", worker.name)
worker.priority = int(job_data.get('priority', worker.priority))
worker.start_frame = int(job_data.get("start_frame", worker.start_frame))
worker.end_frame = int(job_data.get("end_frame", worker.end_frame))
# determine if we can / should split the job
if job_data.get("enable_split_jobs", False) and (worker.total_frames > 1) and not worker.parent:
DistributedJobManager.split_into_subjobs(worker, job_data, loaded_project_local_path)
else:
logger.debug("Not splitting into subjobs")
RenderQueue.add_to_render_queue(worker, force_start=job_data.get('force_start', False))
if not worker.parent:
from src.api.api_server import make_job_ready
make_job_ready(worker.id)
results.append(worker.json())
except FileNotFoundError as e:
err_msg = f"Cannot create job: {e}"
logger.error(err_msg)
results.append({'error': err_msg})
except Exception as e:
err_msg = f"Exception creating render job: {e}"
logger.exception(err_msg)
results.append({'error': err_msg})
return results
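
The job directory naming used in `handle_uploaded_project_files` (timestamp, renderer, and cleaned project name joined with hyphens) can be sketched in isolation. `job_dir_name` below is a hypothetical helper, not part of the module; it mirrors the `'-'.join(...)` expression above:

```python
import os
from datetime import datetime

def job_dir_name(referred_name: str, renderer: str, now: datetime = None) -> str:
    # Mirrors handle_uploaded_project_files: timestamp, renderer, and the
    # project basename (extension stripped, spaces replaced) joined by '-'.
    now = now or datetime.now()
    cleaned = os.path.splitext(referred_name)[0].replace(' ', '_')
    return '-'.join([now.strftime("%Y.%m.%d_%H.%M.%S"), renderer, cleaned])

print(job_dir_name("My Scene.blend", "blender", datetime(2023, 11, 17, 8, 18, 7)))
# → 2023.11.17_08.18.07-blender-My_Scene
```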


@@ -1,100 +1,158 @@
#!/usr/bin/env python3
import concurrent.futures
import json
import logging
import multiprocessing
import os
import pathlib
import shutil
import socket
import ssl
import tempfile
import threading
import time
import traceback
from datetime import datetime
from pathlib import Path
from typing import Dict, Any, Optional
from zipfile import ZipFile
import cpuinfo
import json2html
import psutil
import yaml
from flask import Flask, request, send_file, after_this_request, Response, redirect, url_for
from sqlalchemy.orm.exc import DetachedInstanceError
from flask import Flask, request, render_template, send_file, after_this_request, Response, redirect, url_for, abort
from src.api.job_import_handler import JobImportHandler
from src.api.preview_manager import PreviewManager
from src.api.add_job_helpers import handle_uploaded_project_files, process_zipped_project, create_render_jobs
from src.api.serverproxy_manager import ServerProxyManager
from src.distributed_job_manager import DistributedJobManager
from src.engines.core.base_worker import string_to_status, RenderStatus
from src.engines.engine_manager import EngineManager
from src.render_queue import RenderQueue, JobNotFoundError
from src.utilities.config import Config
from src.utilities.misc_helper import current_system_os, current_system_cpu, \
current_system_os_version, num_to_alphanumeric, get_gpu_info
from src.utilities.status_utils import string_to_status
from src.version import APP_VERSION
from src.utilities.misc_helper import system_safe_path, current_system_os, current_system_cpu, \
current_system_os_version, config_dir
from src.utilities.server_helper import generate_thumbnail_for_job
from src.utilities.zeroconf_server import ZeroconfServer
logger = logging.getLogger()
server = Flask(__name__)
server = Flask(__name__, template_folder='web/templates', static_folder='web/static')
ssl._create_default_https_context = ssl._create_unverified_context # disable SSL for downloads
API_VERSION = "0.1"
def start_api_server(hostname: Optional[str] = None) -> None:
# get hostname
if not hostname:
local_hostname = socket.gethostname()
hostname = local_hostname + (".local" if not local_hostname.endswith(".local") else "")
# load flask settings
server.config['HOSTNAME'] = hostname
server.config['PORT'] = int(Config.port_number)
server.config['UPLOAD_FOLDER'] = str(Path(Config.upload_folder).expanduser())
server.config['MAX_CONTENT_PATH'] = Config.max_content_path
server.config['enable_split_jobs'] = Config.enable_split_jobs
# disable most Flask logging
flask_log = logging.getLogger('werkzeug')
flask_log.setLevel(Config.flask_log_level.upper())
logger.debug('Starting API server')
try:
server.run(host=hostname, port=server.config['PORT'], debug=Config.flask_debug_enable, use_reloader=False,
threaded=True)
finally:
logger.debug('Stopping API server')
categories = [RenderStatus.RUNNING, RenderStatus.ERROR, RenderStatus.NOT_STARTED, RenderStatus.SCHEDULED,
RenderStatus.COMPLETED, RenderStatus.CANCELLED]
# --------------------------------------------
# Get All Jobs
# --------------------------------------------
def sorted_jobs(all_jobs, sort_by_date=True):
if not sort_by_date:
sorted_job_list = []
if all_jobs:
for status_category in categories:
found_jobs = [x for x in all_jobs if x.status == status_category.value]
if found_jobs:
sorted_found_jobs = sorted(found_jobs, key=lambda d: d.date_created, reverse=True)
sorted_job_list.extend(sorted_found_jobs)
else:
sorted_job_list = sorted(all_jobs, key=lambda d: d.date_created, reverse=True)
return sorted_job_list
@server.route('/')
@server.route('/index')
def index():
with open(system_safe_path(os.path.join(config_dir(), 'presets.yaml'))) as f:
render_presets = yaml.load(f, Loader=yaml.FullLoader)
return render_template('index.html', all_jobs=sorted_jobs(RenderQueue.all_jobs()),
hostname=server.config['HOSTNAME'], renderer_info=renderer_info(),
render_clients=[server.config['HOSTNAME']], preset_list=render_presets)
@server.get('/api/jobs')
def jobs_json() -> Dict[str, Any]:
"""Retrieves all jobs from the render queue in JSON format.
def jobs_json():
try:
hash_token = request.args.get('token', None)
all_jobs = [x.json() for x in RenderQueue.all_jobs()]
job_cache_token = str(json.dumps(all_jobs).__hash__())
This endpoint fetches all jobs currently in the render queue, converts them to JSON format,
and returns them along with a cache token that represents the current state of the job list.
Returns:
dict: A dictionary containing:
- 'jobs' (list[dict]): A list of job dictionaries, each representing a job in the queue.
- 'token' (str): A cache token generated from the hash of the job list.
"""
all_jobs = [x.json() for x in RenderQueue.all_jobs()]
job_cache_int = int(json.dumps(all_jobs).__hash__())
job_cache_token = num_to_alphanumeric(job_cache_int)
return {'jobs': all_jobs, 'token': job_cache_token}
if hash_token and hash_token == job_cache_token:
return [], 204 # no need to update
else:
return {'jobs': all_jobs, 'token': job_cache_token}
except Exception as e:
logger.exception(f"Exception fetching all_jobs_cached: {e}")
return [], 500
@server.get('/api/jobs_long_poll')
def long_polling_jobs():
hash_token = request.args.get('token', None)
start_time = time.time()
while True:
all_jobs = jobs_json()
if all_jobs['token'] != hash_token:
return all_jobs
# Break after 30 seconds to avoid gateway timeout
if time.time() - start_time > 30:
return {}, 204
time.sleep(1)
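
A client of `/api/jobs` can use the cache token to skip redundant refreshes. The sketch below factors the response handling into a pure helper; `apply_jobs_response` is a hypothetical name, and a real client would obtain `status_code` and `body` from something like `requests.get(f"http://{host}:{port}/api/jobs", params={'token': token})`:

```python
def apply_jobs_response(status_code, body, prev_state):
    # 204 means the server-side token matched ours: nothing changed, keep state.
    if status_code == 204:
        return prev_state
    # Otherwise adopt the fresh job list and remember the new cache token.
    return {'jobs': body['jobs'], 'token': body['token']}

state = {'jobs': [], 'token': None}
state = apply_jobs_response(200, {'jobs': [{'id': 'a1'}], 'token': 'XyZ'}, state)
state = apply_jobs_response(204, None, state)  # unchanged; token is still 'XyZ'
```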
@server.route('/ui/job/<job_id>/full_details')
def job_detail(job_id):
found_job = RenderQueue.job_with_id(job_id)
table_html = json2html.json2html.convert(json=found_job.json(),
table_attributes='class="table is-narrow is-striped is-fullwidth"')
media_url = None
if found_job.file_list() and found_job.status == RenderStatus.COMPLETED:
media_basename = os.path.basename(found_job.file_list()[0])
media_url = f"/api/job/{job_id}/file/{media_basename}"
return render_template('details.html', detail_table=table_html, media_url=media_url,
hostname=server.config['HOSTNAME'], job_status=found_job.status.value.title(),
job=found_job, renderer_info=renderer_info())
@server.route('/api/job/<job_id>/thumbnail')
def job_thumbnail(job_id):
big_thumb = request.args.get('size', False) == "big"
video_ok = request.args.get('video_ok', False)
found_job = RenderQueue.job_with_id(job_id, none_ok=True)
if found_job:
os.makedirs(server.config['THUMBS_FOLDER'], exist_ok=True)
thumb_video_path = os.path.join(server.config['THUMBS_FOLDER'], found_job.id + '.mp4')
thumb_image_path = os.path.join(server.config['THUMBS_FOLDER'], found_job.id + '.jpg')
big_video_path = os.path.join(server.config['THUMBS_FOLDER'], found_job.id + '_big.mp4')
big_image_path = os.path.join(server.config['THUMBS_FOLDER'], found_job.id + '_big.jpg')
# generate regular thumb if it doesn't exist
if not os.path.exists(thumb_video_path) and not os.path.exists(thumb_video_path + '_IN-PROGRESS') and \
found_job.status not in [RenderStatus.CANCELLED, RenderStatus.ERROR]:
generate_thumbnail_for_job(found_job, thumb_video_path, thumb_image_path, max_width=240)
# generate big thumb if it doesn't exist
if not os.path.exists(big_video_path) and not os.path.exists(big_image_path + '_IN-PROGRESS') and \
found_job.status not in [RenderStatus.CANCELLED, RenderStatus.ERROR]:
generate_thumbnail_for_job(found_job, big_video_path, big_image_path, max_width=800)
# generated videos
if video_ok:
if big_thumb and os.path.exists(big_video_path) and not os.path.exists(
big_video_path + '_IN-PROGRESS'):
return send_file(big_video_path, mimetype="video/mp4")
elif os.path.exists(thumb_video_path) and not os.path.exists(thumb_video_path + '_IN-PROGRESS'):
return send_file(thumb_video_path, mimetype="video/mp4")
# Generated thumbs
if big_thumb and os.path.exists(big_image_path):
return send_file(big_image_path, mimetype='image/jpeg')
elif os.path.exists(thumb_image_path):
return send_file(thumb_image_path, mimetype='image/jpeg')
# Misc status icons
if found_job.status == RenderStatus.RUNNING:
return send_file('../web/static/images/gears.png', mimetype="image/png")
elif found_job.status == RenderStatus.CANCELLED:
return send_file('../web/static/images/cancelled.png', mimetype="image/png")
elif found_job.status == RenderStatus.SCHEDULED:
return send_file('../web/static/images/scheduled.png', mimetype="image/png")
elif found_job.status == RenderStatus.NOT_STARTED:
return send_file('../web/static/images/not_started.png', mimetype="image/png")
# errors
return send_file('../web/static/images/error.png', mimetype="image/png")
# Get job file routing
@server.route('/api/job/<job_id>/file/<filename>', methods=['GET'])
def get_job_file(job_id, filename):
found_job = RenderQueue.job_with_id(job_id)
try:
for full_path in found_job.file_list():
if filename in full_path:
return send_file(path_or_file=full_path)
except FileNotFoundError:
abort(404)
@server.get('/api/jobs/<status_val>')
@@ -107,37 +165,33 @@ def filtered_jobs_json(status_val):
return f'Cannot find jobs with status {status_val}', 400
# --------------------------------------------
# Job Details / File Handling
# --------------------------------------------
@server.post('/api/job/<job_id>/notify_parent_of_status_change')
def subjob_status_change(job_id):
try:
subjob_details = request.json
logger.info(f"Subjob to job id: {job_id} is now {subjob_details['status']}")
DistributedJobManager.handle_subjob_status_change(RenderQueue.job_with_id(job_id), subjob_data=subjob_details)
return Response(status=200)
except JobNotFoundError:
return "Job not found", 404
@server.errorhandler(JobNotFoundError)
def handle_job_not_found(job_error):
return f'Cannot find job with ID {job_error.job_id}', 400
@server.get('/api/job/<job_id>')
def get_job_details(job_id):
"""Retrieves the details of a requested job in JSON format
Args:
job_id (str): The ID of the render job.
Returns:
dict: A JSON representation of the job's details.
"""
def get_job_status(job_id):
return RenderQueue.job_with_id(job_id).json()
@server.get('/api/job/<job_id>/logs')
def get_job_logs(job_id):
"""Retrieves the log file for a specific render job.
Args:
job_id (str): The ID of the render job.
Returns:
Response: The log file's content as plain text, or an empty response if the log file is not found.
"""
found_job = RenderQueue.job_with_id(job_id)
log_path = Path(found_job.log_path())
log_path = system_safe_path(found_job.log_path())
log_data = None
if log_path and log_path.exists():
if log_path and os.path.exists(log_path):
with open(log_path) as file:
log_data = file.read()
return Response(log_data, mimetype='text/plain')
@@ -145,64 +199,57 @@ def get_job_logs(job_id):
@server.get('/api/job/<job_id>/file_list')
def get_file_list(job_id):
return [Path(p).name for p in RenderQueue.job_with_id(job_id).file_list()]
return RenderQueue.job_with_id(job_id).file_list()
@server.route('/api/job/<job_id>/download')
def download_requested_file(job_id):
requested_filename = request.args.get("filename")
if not requested_filename:
return "Filename required", 400
found_job = RenderQueue.job_with_id(job_id)
for job_file in found_job.file_list():
p = Path(job_file)
if p.name.lower() == requested_filename.lower():
return send_file(str(p), as_attachment=True)
return f"File '{requested_filename}' not found", 404
@server.get('/api/job/<job_id>/make_ready')
def make_job_ready(job_id):
try:
found_job = RenderQueue.job_with_id(job_id)
if found_job.status in [RenderStatus.CONFIGURING, RenderStatus.NOT_STARTED]:
if found_job.children:
for child_key in found_job.children.keys():
child_id = child_key.split('@')[0]
hostname = child_key.split('@')[-1]
ServerProxyManager.get_proxy_for_hostname(hostname).request_data(f'job/{child_id}/make_ready')
found_job.status = RenderStatus.NOT_STARTED
RenderQueue.save_state()
return found_job.json(), 200
except Exception as e:
return f"Error making job ready: {e}", 500
return "Not valid command", 405
@server.route('/api/job/<job_id>/download_all')
def download_all_files(job_id):
def download_all(job_id):
zip_filename = None
@after_this_request
def clear_zip(response):
if zip_filename and zip_filename.exists():
try:
zip_filename.unlink()
except Exception as e:
logger.warning(f"Error removing zip file '{zip_filename}': {e}")
if zip_filename and os.path.exists(zip_filename):
os.remove(zip_filename)
return response
found_job = RenderQueue.job_with_id(job_id)
output_dir = os.path.dirname(found_job.output_path)
if os.path.exists(output_dir):
zip_filename = system_safe_path(os.path.join(tempfile.gettempdir(),
pathlib.Path(found_job.input_path).stem + '.zip'))
with ZipFile(zip_filename, 'w') as zipObj:
for f in os.listdir(output_dir):
zipObj.write(filename=system_safe_path(os.path.join(output_dir, f)),
arcname=os.path.basename(f))
return send_file(zip_filename, mimetype="zip", as_attachment=True)
else:
return f'Cannot find project files for job {job_id}', 500
output_dir = Path(found_job.output_path).parent
if not output_dir.exists():
return f"Cannot find project files for job {job_id}", 500
zip_filename = Path(tempfile.gettempdir()) / f"{Path(found_job.input_path).stem}.zip"
with ZipFile(zip_filename, "w") as zipObj:
for f in output_dir.iterdir():
if f.is_file():
zipObj.write(f, arcname=f.name)
return send_file(str(zip_filename), mimetype="zip", as_attachment=True)
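
`/api/job/<job_id>/download_all` streams back a zip of the job's output directory. The sketch below shows what handling that response can look like client-side; `extract_job_zip` is a hypothetical helper, and the demo builds its own zip bytes rather than hitting the server:

```python
import io
import tempfile
import zipfile
from pathlib import Path

def extract_job_zip(zip_bytes: bytes, dest_dir: str) -> list:
    # Write the downloaded archive to disk, extract it, and list the files.
    dest = Path(dest_dir)
    archive = dest / "job_output.zip"
    archive.write_bytes(zip_bytes)
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(dest)
    return sorted(p.name for p in dest.iterdir() if p != archive)

# Demo with a locally built zip standing in for the HTTP response body.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("frame_0001.png", b"...")
with tempfile.TemporaryDirectory() as tmp:
    print(extract_job_zip(buf.getvalue(), tmp))  # → ['frame_0001.png']
```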
# --------------------------------------------
# System Environment / Status
# --------------------------------------------
@server.get('/api/presets')
def presets() -> Dict[str, Any]:
presets_path = Path('config/presets.yaml')
def presets():
presets_path = system_safe_path('config/presets.yaml')
with open(presets_path) as f:
loaded_presets = yaml.load(f, Loader=yaml.FullLoader)
return loaded_presets
presets = yaml.load(f, Loader=yaml.FullLoader)
return presets
@server.get('/api/full_status')
@@ -228,81 +275,55 @@ def snapshot():
return server_data
@server.route('/api/status')
def status():
return {"timestamp": datetime.now().isoformat(),
"system_os": current_system_os(),
"system_os_version": current_system_os_version(),
"system_cpu": current_system_cpu(),
"system_cpu_brand": cpuinfo.get_cpu_info()['brand_raw'],
"cpu_percent": psutil.cpu_percent(percpu=False),
"cpu_percent_per_cpu": psutil.cpu_percent(percpu=True),
"cpu_count": psutil.cpu_count(logical=False),
"memory_total": psutil.virtual_memory().total,
"memory_available": psutil.virtual_memory().available,
"memory_percent": psutil.virtual_memory().percent,
"job_counts": RenderQueue.job_counts(),
"hostname": server.config['HOSTNAME'],
"port": server.config['PORT'],
"app_version": APP_VERSION,
"api_version": API_VERSION
}
@server.get('/api/_detected_clients')
def detected_clients():
# todo: dev/debug only. Should not ship this - probably.
return ZeroconfServer.found_hostnames()
# --------------------------------------------
# Job Lifecycle (Create, Cancel, Delete)
# --------------------------------------------
# New version
@server.post('/api/add_job')
def add_job_handler():
"""
POST /api/add_job
Add a render job to the queue.
**Request Formats**
- JSON body:
{
"name": "example.blend",
"engine": "blender",
"frame_start": 1,
"frame_end": 100,
"render_settings": {...},
"child_jobs": [...]
}
**Responses**
200 Success
400 Invalid or missing input
500 Internal server error while parsing or creating jobs
"""
# Process request data
try:
if request.is_json:
new_job_data = request.get_json()
jobs_list = [request.json] if not isinstance(request.json, list) else request.json
elif request.form.get('json', None):
new_job_data = json.loads(request.form['json'])
jobs_list = json.loads(request.form['json'])
else:
return "Cannot find valid job data", 400
# Cleanup flat form data into nested structure
form_dict = {k: v for k, v in dict(request.form).items() if v}
args = {}
arg_keys = [k for k in form_dict.keys() if '-arg_' in k]
for server_hostname in arg_keys:
if form_dict['renderer'] in server_hostname or 'AnyRenderer' in server_hostname:
cleaned_key = server_hostname.split('-arg_')[-1]
args[cleaned_key] = form_dict[server_hostname]
form_dict.pop(server_hostname)
args['raw'] = form_dict.get('raw_args', None)
form_dict['args'] = args
jobs_list = [form_dict]
except Exception as e:
err_msg = f"Error processing job data: {e}"
logger.error(err_msg)
return err_msg, 500
# Validate Job Data - check for required values and download or unzip project files
try:
processed_job_data = JobImportHandler.validate_job_data(new_job_data, server.config['UPLOAD_FOLDER'],
uploaded_file=request.files.get('file'))
except (KeyError, FileNotFoundError) as e:
err_msg = f"Error processing job data: {e}"
return err_msg, 400
except Exception as e:
traceback.print_exception(e)
err_msg = f"Unknown error processing data: {e}"
return err_msg, 500
loaded_project_local_path, referred_name = handle_uploaded_project_files(request, jobs_list,
server.config['UPLOAD_FOLDER'])
if loaded_project_local_path.lower().endswith('.zip'):
loaded_project_local_path = process_zipped_project(loaded_project_local_path)
try:
return JobImportHandler.create_jobs_from_processed_data(processed_job_data)
results = create_render_jobs(jobs_list, loaded_project_local_path, referred_name)
for response in results:
if response.get('error', None):
return results, 400
if request.args.get('redirect', False):
return redirect(url_for('index'))
else:
return results, 200
except Exception as e:
logger.exception(f"Error creating render job: {e}")
logger.exception(f"Unknown error adding job: {e}")
return 'unknown error', 500
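
A minimal client call against `POST /api/add_job`, assuming the JSON field names visible in the handlers above (`url`, `renderer`, `start_frame`, `end_frame`, `args`); `build_add_job_payload` and the host/port values are illustrative, not part of the API:

```python
def build_add_job_payload(url: str, renderer: str, start_frame: int,
                          end_frame: int, **args) -> list:
    # The handler wraps a bare dict into a one-element list, so sending a
    # list is always safe.
    return [{'url': url, 'renderer': renderer,
             'start_frame': start_frame, 'end_frame': end_frame,
             'args': args}]

payload = build_add_job_payload("https://example.com/scene.blend", "blender",
                                1, 100, raw="--cycles-device CPU")
# A real client would then POST it, e.g.:
#   requests.post("http://render-host.local:8080/api/add_job", json=payload)
```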
@@ -323,190 +344,103 @@ def cancel_job(job_id):
@server.route('/api/job/<job_id>/delete', methods=['POST', 'GET'])
def delete_job(job_id):
try:
if not request.args.get("confirm", False):
return "Confirmation required to delete job", 400
if not request.args.get('confirm', False):
return 'Confirmation required to delete job', 400
# Check if we can remove the 'output' directory
found_job = RenderQueue.job_with_id(job_id)
input_path = Path(found_job.input_path)
output_path = Path(found_job.output_path)
upload_root = Path(server.config["UPLOAD_FOLDER"])
project_dir = input_path.parent.parent
output_dir = output_path.parent
found_job.stop()
try:
PreviewManager.delete_previews_for_job(found_job)
except Exception as e:
logger.error(f"Error deleting previews for {found_job}: {e}")
RenderQueue.delete_job(found_job)
# Delete output directory if we own it
if output_dir.exists() and output_dir.is_relative_to(upload_root):
output_dir = os.path.dirname(found_job.output_path)
if server.config['UPLOAD_FOLDER'] in output_dir and os.path.exists(output_dir):
shutil.rmtree(output_dir)
# Delete project directory if we own it and it's unused
try:
if project_dir.exists() and project_dir.is_relative_to(upload_root):
project_dir_files = [p for p in project_dir.iterdir() if not p.name.startswith(".")]
if not project_dir_files or (len(project_dir_files) == 1 and "source" in project_dir_files[0].name):
logger.info(f"Removing project directory: {project_dir}")
shutil.rmtree(project_dir)
except Exception as e:
logger.error(f"Error removing project files: {e}")
# Remove any thumbnails
for filename in os.listdir(server.config['THUMBS_FOLDER']):
if job_id in filename:
os.remove(os.path.join(server.config['THUMBS_FOLDER'], filename))
return "Job deleted", 200
thumb_path = os.path.join(server.config['THUMBS_FOLDER'], found_job.id + '.mp4')
if os.path.exists(thumb_path):
os.remove(thumb_path)
# See if we own the project_dir (i.e. was it uploaded)
project_dir = os.path.dirname(os.path.dirname(found_job.input_path))
if server.config['UPLOAD_FOLDER'] in project_dir and os.path.exists(project_dir):
# check to see if any other projects are sharing the same project file
project_dir_files = [f for f in os.listdir(project_dir) if not f.startswith('.')]
if len(project_dir_files) == 0 or (len(project_dir_files) == 1 and 'source' in project_dir_files[0]):
logger.info(f"Removing project directory: {project_dir}")
shutil.rmtree(project_dir)
RenderQueue.delete_job(found_job)
if request.args.get('redirect', False):
return redirect(url_for('index'))
else:
return "Job deleted", 200
except Exception as e:
logger.error(f"Error deleting job: {e}")
return f"Error deleting job: {e}", 500
# --------------------------------------------
# Engine Info and Management:
# --------------------------------------------
@server.get('/api/engine_for_filename')
def get_engine_for_filename():
filename = request.args.get("filename")
if not filename:
return "Error: filename is required", 400
found_engine = EngineManager.engine_class_for_project_path(filename)
if not found_engine:
return f"Error: cannot find a suitable engine for '{filename}'", 400
return found_engine.name()
@server.get('/api/installed_engines')
def get_installed_engines():
result = {}
for engine_class in EngineManager.supported_engines():
data = EngineManager.all_version_data_for_engine(engine_class.name())
if data:
result[engine_class.name()] = data
return result
@server.get('/api/clear_history')
def clear_history():
RenderQueue.clear_history()
return 'success'
@server.get('/api/engine_info')
def engine_info():
response_type = request.args.get('response_type', 'standard')
if response_type not in ['full', 'standard']:
raise ValueError(f"Invalid response_type: {response_type}")
@server.route('/api/status')
def status():
renderer_data = {}
for render_class in EngineManager.supported_engines():
if EngineManager.all_versions_for_engine(render_class.name()):  # only return renderers installed on host
renderer_data[render_class.engine.name()] = \
{'versions': EngineManager.all_versions_for_engine(render_class.engine.name()),
'is_available': RenderQueue.is_available_for_job(render_class.engine.name())
}
def process_engine(engine):
try:
# Get all installed versions of the engine
installed_versions = EngineManager.all_version_data_for_engine(engine.name())
if not installed_versions:
return None
system_installed_versions = [v for v in installed_versions if v['type'] == 'system']
install_path = system_installed_versions[0]['path'] if system_installed_versions else installed_versions[0]['path']
en = engine(install_path)
engine_name = en.name()
result = {
engine_name: {
'is_available': RenderQueue.is_available_for_job(engine_name),
'versions': installed_versions
}
# Get system info
return {"timestamp": datetime.now().isoformat(),
"system_os": current_system_os(),
"system_os_version": current_system_os_version(),
"system_cpu": current_system_cpu(),
"cpu_percent": psutil.cpu_percent(percpu=False),
"cpu_percent_per_cpu": psutil.cpu_percent(percpu=True),
"cpu_count": psutil.cpu_count(logical=False),
"memory_total": psutil.virtual_memory().total,
"memory_available": psutil.virtual_memory().available,
"memory_percent": psutil.virtual_memory().percent,
"job_counts": RenderQueue.job_counts(),
"renderers": renderer_data,
"hostname": server.config['HOSTNAME'],
"port": server.config['PORT']
}
if response_type == 'full':
with concurrent.futures.ThreadPoolExecutor() as executor:
future_results = {
'supported_extensions': executor.submit(en.supported_extensions),
'supported_export_formats': executor.submit(en.get_output_formats),
'system_info': executor.submit(en.system_info)
}
for key, future in future_results.items():
result[engine_name][key] = future.result()
return result
except Exception as e:
traceback.print_exc()
logger.error(f"Error fetching details for engine '{engine.name()}': {e}")
return {}
engine_data = {}
with concurrent.futures.ThreadPoolExecutor() as executor:
futures = {executor.submit(process_engine, engine): engine.name() for engine in EngineManager.supported_engines()}
for future in concurrent.futures.as_completed(futures):
result = future.result()
if result:
engine_data.update(result)
return engine_data
@server.get('/api/<engine_name>/info')
def get_engine_info(engine_name):
try:
response_type = request.args.get('response_type', 'standard')
# Get all installed versions of the engine
installed_versions = EngineManager.all_version_data_for_engine(engine_name)
if not installed_versions:
return {}
result = {'is_available': RenderQueue.is_available_for_job(engine_name),
'versions': installed_versions}
if response_type == 'full':
with concurrent.futures.ThreadPoolExecutor() as executor:
engine_class = EngineManager.engine_class_with_name(engine_name)
en = EngineManager.get_latest_engine_instance(engine_class)
future_results = {
'supported_extensions': executor.submit(en.supported_extensions),
'supported_export_formats': executor.submit(en.get_output_formats),
'system_info': executor.submit(en.system_info),
'options': executor.submit(en.ui_options)
}
for key, future in future_results.items():
result[key] = future.result()
return result
except Exception as e:
logger.error(f"Error fetching details for engine '{engine_name}': {e}")
return {}
@server.get('/api/renderer_info')
def renderer_info():
renderer_data = {}
for engine in EngineManager.supported_engines():
# Get all installed versions of engine
installed_versions = EngineManager.all_versions_for_engine(engine.name())
if installed_versions:
# fixme: using system versions only because downloaded versions may have permissions issues
system_installed_versions = [x for x in installed_versions if x['type'] == 'system']
install_path = system_installed_versions[0]['path'] if system_installed_versions else installed_versions[0]['path']
renderer_data[engine.name()] = {'is_available': RenderQueue.is_available_for_job(engine.name()),
'versions': installed_versions,
'supported_extensions': engine.supported_extensions(),
'supported_export_formats': engine(install_path).get_output_formats()}
return renderer_data
@server.get('/api/<engine_name>/is_available')
def is_engine_available(engine_name):
return {'engine': engine_name, 'available': RenderQueue.is_available_for_job(engine_name),
'cpu_count': int(psutil.cpu_count(logical=False)),
'versions': EngineManager.all_version_data_for_engine(engine_name),
'versions': EngineManager.all_versions_for_engine(engine_name),
'hostname': server.config['HOSTNAME']}
@server.get('/api/engine/<engine_name>/args')
def get_engine_args(engine_name):
try:
engine_class = EngineManager.engine_class_with_name(engine_name)
return engine_class().get_arguments()
except LookupError:
return f"Cannot find engine '{engine_name}'", 400
@server.get('/api/engine/<engine_name>/help')
def get_engine_help(engine_name):
try:
engine_class = EngineManager.engine_class_with_name(engine_name)
return engine_class().get_help()
except LookupError:
return f"Cannot find engine '{engine_name}'", 400
# --------------------------------------------
# Engine Downloads and Updates:
# --------------------------------------------
@server.get('/api/is_engine_available_to_download')
def is_engine_available_to_download():
available_result = EngineManager.version_is_available_to_download(request.args.get('engine'),
@@ -547,109 +481,83 @@ def delete_engine_download():
(f"Error deleting {json_data.get('engine')} {json_data.get('version')}", 500)
# --------------------------------------------
# Miscellaneous:
# --------------------------------------------
@server.get('/api/heartbeat')
def heartbeat():
return datetime.now().isoformat(), 200
@server.post('/api/job/<job_id>/send_subjob_update_notification')
def subjob_update_notification(job_id):
subjob_details = request.json
DistributedJobManager.handle_subjob_update_notification(RenderQueue.job_with_id(job_id), subjob_data=subjob_details)
return Response(status=200)
@server.get('/api/renderer/<renderer>/args')
def get_renderer_args(renderer):
try:
renderer_engine_class = EngineManager.engine_with_name(renderer)
return renderer_engine_class().get_arguments()
except LookupError:
return f"Cannot find renderer '{renderer}'", 400
@server.route('/api/job/<job_id>/thumbnail')
def job_thumbnail(job_id):
@server.get('/api/renderer/<renderer>/help')
def get_renderer_help(renderer):
try:
renderer_engine_class = EngineManager.engine_with_name(renderer)
return renderer_engine_class().get_help()
except LookupError:
return f"Cannot find renderer '{renderer}'", 400
@server.route('/upload')
def upload_file_page():
return render_template('upload.html', supported_renderers=EngineManager.supported_engines())
def start_server():
def eval_loop(delay_sec=1):
while True:
RenderQueue.evaluate_queue()
time.sleep(delay_sec)
# get hostname
local_hostname = socket.gethostname()
local_hostname = local_hostname + (".local" if not local_hostname.endswith(".local") else "")
# load flask settings
server.config['HOSTNAME'] = local_hostname
server.config['PORT'] = int(Config.port_number)
server.config['UPLOAD_FOLDER'] = system_safe_path(os.path.expanduser(Config.upload_folder))
server.config['THUMBS_FOLDER'] = system_safe_path(os.path.join(os.path.expanduser(Config.upload_folder), 'thumbs'))
server.config['MAX_CONTENT_PATH'] = Config.max_content_path
server.config['enable_split_jobs'] = Config.enable_split_jobs
# Setup directory for saving engines to
EngineManager.engines_path = system_safe_path(os.path.join(os.path.expanduser(Config.upload_folder), 'engines'))
os.makedirs(EngineManager.engines_path, exist_ok=True)
# Debug info
logger.debug(f"Upload directory: {server.config['UPLOAD_FOLDER']}")
logger.debug(f"Thumbs directory: {server.config['THUMBS_FOLDER']}")
logger.debug(f"Engines directory: {EngineManager.engines_path}")
# disable most Flask logging
flask_log = logging.getLogger('werkzeug')
flask_log.setLevel(Config.flask_log_level.upper())
# check for updates for render engines if config'd or on first launch
if Config.update_engines_on_launch or not EngineManager.all_engines():
EngineManager.update_all_engines()
# Set up the RenderQueue object
RenderQueue.load_state(database_directory=server.config['UPLOAD_FOLDER'])
ServerProxyManager.subscribe_to_listener()
DistributedJobManager.subscribe_to_listener()
thread = threading.Thread(target=eval_loop, kwargs={'delay_sec': Config.queue_eval_seconds}, daemon=True)
thread.start()
logger.info(f"Starting Zordon Render Server - Hostname: '{server.config['HOSTNAME']}'")
ZeroconfServer.configure("_zordon._tcp.local.", server.config['HOSTNAME'], server.config['PORT'])
ZeroconfServer.properties = {'system_cpu': current_system_cpu(), 'system_cpu_cores': multiprocessing.cpu_count(),
'system_os': current_system_os(),
'system_os_version': current_system_os_version()}
ZeroconfServer.start()
try:
big_thumb = request.args.get('size', False) == "big"
video_ok = request.args.get('video_ok', False)
found_job = RenderQueue.job_with_id(job_id, none_ok=False)
# trigger a thumbnail update - just in case
PreviewManager.update_previews_for_job(found_job, wait_until_completion=True, timeout=60)
previews = PreviewManager.get_previews_for_job(found_job)
all_previews_list = previews.get('output', previews.get('input', []))
video_previews = [x for x in all_previews_list if x['kind'] == 'video']
image_previews = [x for x in all_previews_list if x['kind'] == 'image']
filtered_list = video_previews if video_previews and video_ok else image_previews
# todo - sort by size or other metrics here
if filtered_list:
preview_to_send = filtered_list[0]
mime_types = {'image': 'image/jpeg', 'video': 'video/mp4'}
file_mime_type = mime_types.get(preview_to_send['kind'], 'unknown')
return send_file(preview_to_send['filename'], mimetype=file_mime_type)
except Exception as e:
logger.error(f'Error getting thumbnail: {e}')
return f'Error getting thumbnail: {e}', 500
return "No thumbnail available", 404
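The thumbnail route's selection logic (prefer output previews over input previews, and video over image only when the caller allows video) can be factored into a small pure function. A sketch of that precedence, separate from the Flask handler:

```python
def pick_preview(previews: dict, video_ok: bool):
    """Choose which preview entry to serve.

    `previews` maps a label ('output' or 'input') to a list of entries
    like {'kind': 'video'|'image', 'filename': ...}. Output previews
    win over input previews; video wins over image only if video_ok.
    Returns None when nothing usable is available."""
    entries = previews.get('output', previews.get('input', []))
    videos = [e for e in entries if e['kind'] == 'video']
    images = [e for e in entries if e['kind'] == 'image']
    chosen = videos if (videos and video_ok) else images
    return chosen[0] if chosen else None
```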
# --------------------------------------------
# System Benchmarks:
# --------------------------------------------
@server.get('/api/cpu_benchmark')
def get_cpu_benchmark_score():
from src.utilities.benchmark import cpu_benchmark
return str(cpu_benchmark(10))
@server.get('/api/disk_benchmark')
def get_disk_benchmark():
from src.utilities.benchmark import disk_io_benchmark
results = disk_io_benchmark()
return {'write_speed': results[0], 'read_speed': results[-1]}
# --------------------------------------------
# Error Handlers:
# --------------------------------------------
@server.errorhandler(JobNotFoundError)
def handle_job_not_found(job_error):
return str(job_error), 400
@server.errorhandler(DetachedInstanceError)
def handle_detached_instance(_):
return "Unavailable", 503
@server.errorhandler(404)
def handle_404(error):
url = request.url
err_msg = f"404 Not Found: {url}"
if 'favicon' not in url:
logger.warning(err_msg)
return err_msg, 404
@server.errorhandler(Exception)
def handle_general_error(general_error):
traceback.print_exception(type(general_error), general_error, general_error.__traceback__)
err_msg = f"Server error: {general_error}"
logger.error(err_msg)
return err_msg, 500
# --------------------------------------------
# Debug / Development Only:
# --------------------------------------------
@server.get('/api/_debug/detected_clients')
def detected_clients():
# todo: dev/debug only. Should not ship this - probably.
from src.utilities.zeroconf_server import ZeroconfServer
return ZeroconfServer.found_hostnames()
@server.get('/api/_debug/clear_history')
def clear_history():
RenderQueue.clear_history()
return 'success'
server.run(host='0.0.0.0', port=server.config['PORT'], debug=Config.flask_debug_enable,
use_reloader=False, threaded=True)
finally:
RenderQueue.save_state()
ZeroconfServer.stop()


@@ -1,221 +0,0 @@
#!/usr/bin/env python3
import logging
import os
import shutil
import tempfile
import zipfile
from datetime import datetime
from pathlib import Path
import requests
from tqdm import tqdm
from werkzeug.utils import secure_filename
from distributed_job_manager import DistributedJobManager
logger = logging.getLogger()
class JobImportHandler:
"""Handles job import operations for rendering projects.
This class provides functionality to validate, download, and process
job data and project files for the rendering queue system.
"""
@classmethod
def create_jobs_from_processed_data(cls, processed_job_data: dict) -> list[dict]:
""" Takes processed job data and creates new jobs
Args: processed_job_data: Dictionary containing job information"""
loaded_project_local_path = processed_job_data['__loaded_project_local_path']
# prepare child job data
job_data_to_create = []
if processed_job_data.get("child_jobs"):
for child_job_diffs in processed_job_data["child_jobs"]:
processed_child_job_data = processed_job_data.copy()
processed_child_job_data.pop("child_jobs")
processed_child_job_data.update(child_job_diffs)
job_data_to_create.append(processed_child_job_data)
else:
job_data_to_create.append(processed_job_data)
# create the jobs
created_jobs = []
for job_data in job_data_to_create:
new_job = DistributedJobManager.create_render_job(job_data, loaded_project_local_path)
created_jobs.append(new_job)
# Save notes to .txt
if processed_job_data.get("notes"):
parent_dir = Path(loaded_project_local_path).parent.parent
notes_name = processed_job_data['name'] + "-notes.txt"
with (Path(parent_dir) / notes_name).open("w") as f:
f.write(processed_job_data["notes"])
return [x.json() for x in created_jobs]
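The child-job expansion above (copy the parent data, drop `child_jobs`, overlay each child's diffs) is easy to isolate as a pure function. A standalone sketch of the same copy/pop/update pattern:

```python
def expand_child_jobs(job_data: dict) -> list[dict]:
    """Expand a parent job dict into one dict per child job.

    Each entry in job_data['child_jobs'] is a diff that overrides the
    parent's fields; a job with no child jobs expands to just itself."""
    children = job_data.get("child_jobs")
    if not children:
        return [job_data]
    expanded = []
    for diff in children:
        child = job_data.copy()
        child.pop("child_jobs")   # children must not recurse
        child.update(diff)        # child values override parent values
        expanded.append(child)
    return expanded
```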
@classmethod
def validate_job_data(cls, new_job_data: dict, upload_directory: Path, uploaded_file=None) -> dict:
"""Validates and prepares job data for import.
This method validates the job data dictionary, handles project file
acquisition (upload, download, or local copy), and prepares the job
directory structure.
Args:
new_job_data: Dictionary containing job information including
'name', 'engine_name', and optionally 'url' or 'local_path'.
upload_directory: Base directory for storing uploaded jobs.
uploaded_file: Optional uploaded file object from the request.
Returns:
The validated job data dictionary with additional metadata.
Raises:
KeyError: If required fields 'name' or 'engine_name' are missing.
FileNotFoundError: If no valid project file can be found.
"""
loaded_project_local_path = None
# check for required keys
job_name = new_job_data.get('name')
engine_name = new_job_data.get('engine_name')
if not job_name:
raise KeyError("Missing job name")
if not engine_name:
raise KeyError("Missing engine name")
project_url = new_job_data.get('url', None)
local_path = new_job_data.get('local_path', None)
downloaded_file_url = None
if uploaded_file and uploaded_file.filename:
referred_name = os.path.basename(uploaded_file.filename)
elif project_url:
referred_name, downloaded_file_url = cls.download_project_from_url(project_url)
if not referred_name:
raise FileNotFoundError(f"Error downloading file from URL: {project_url}")
elif local_path and os.path.exists(local_path):
referred_name = os.path.basename(local_path)
else:
raise FileNotFoundError("Cannot find any valid project paths")
# Prepare the local filepath
cleaned_path_name = job_name.replace(' ', '-')
folder_name = f"{cleaned_path_name}-{engine_name}-{datetime.now().strftime('%Y.%m.%d_%H.%M.%S')}"
job_dir = Path(upload_directory) / folder_name
os.makedirs(job_dir, exist_ok=True)
project_source_dir = Path(job_dir) / 'source'
os.makedirs(project_source_dir, exist_ok=True)
# Move projects to their work directories
if uploaded_file and uploaded_file.filename:
# Handle file uploading
filename = secure_filename(uploaded_file.filename)
loaded_project_local_path = Path(project_source_dir) / filename
uploaded_file.save(str(loaded_project_local_path))
logger.info(f"Transfer complete for {loaded_project_local_path.relative_to(upload_directory)}")
elif project_url:
# Handle downloading project from a URL
loaded_project_local_path = Path(project_source_dir) / referred_name
shutil.move(str(downloaded_file_url), str(loaded_project_local_path))
logger.info(f"Download complete for {loaded_project_local_path.relative_to(upload_directory)}")
elif local_path:
# Handle local files
loaded_project_local_path = Path(project_source_dir) / referred_name
shutil.copy(str(local_path), str(loaded_project_local_path))
logger.info(f"Import complete for {loaded_project_local_path.relative_to(upload_directory)}")
if loaded_project_local_path.suffix == ".zip":
loaded_project_local_path = cls.process_zipped_project(loaded_project_local_path)
new_job_data["__loaded_project_local_path"] = loaded_project_local_path
return new_job_data
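The folder-name construction inside `validate_job_data` can be sketched as a small helper; the timestamp is injectable here purely so the result is deterministic (the real code calls `datetime.now()` inline):

```python
from datetime import datetime


def build_job_folder_name(job_name: str, engine_name: str, now=None) -> str:
    """Build the '<name>-<engine>-<timestamp>' folder name used for a
    new job's work directory, replacing spaces with hyphens as above."""
    now = now or datetime.now()
    cleaned = job_name.replace(" ", "-")
    return f"{cleaned}-{engine_name}-{now.strftime('%Y.%m.%d_%H.%M.%S')}"
```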
@staticmethod
def download_project_from_url(project_url: str):
"""Downloads a project file from the given URL.
Downloads the file from the specified URL to a temporary directory
with progress tracking. Returns the filename and temporary path.
Args:
project_url: The URL to download the project file from.
Returns:
A tuple of (filename, temp_file_path) if successful,
(None, None) if download fails.
"""
logger.info(f"Downloading project from url: {project_url}")
referred_name = os.path.basename(project_url)
try:
response = requests.get(project_url, stream=True, timeout=300)
if response.status_code == 200:
# Get the total file size from the "Content-Length" header
file_size = int(response.headers.get("Content-Length", 0))
# Create a progress bar using tqdm
progress_bar = tqdm(total=file_size, unit="B", unit_scale=True)
# Open a file for writing in binary mode
downloaded_file_url = os.path.join(tempfile.gettempdir(), referred_name)
with open(downloaded_file_url, "wb") as file:
for chunk in response.iter_content(chunk_size=1024):
if chunk:
# Write the chunk to the file
file.write(chunk)
# Update the progress bar
progress_bar.update(len(chunk))
# Close the progress bar
progress_bar.close()
return referred_name, downloaded_file_url
except Exception as e:
logger.error(f"Error downloading file: {e}")
return None, None
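The chunked-write loop at the heart of `download_project_from_url` can be exercised without the network by feeding it any iterable of byte chunks. This hypothetical helper is not part of the module; it isolates the same loop minus `requests` and `tqdm`:

```python
def write_chunks(chunks, dest):
    """Write an iterable of byte chunks to a binary file object and
    return the total number of bytes written. Empty chunks (e.g.
    keep-alive chunks from a streamed HTTP response) are skipped,
    matching the `if chunk:` guard above."""
    total = 0
    for chunk in chunks:
        if chunk:
            dest.write(chunk)
            total += len(chunk)
    return total
```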
@staticmethod
def process_zipped_project(zip_path: Path) -> Path:
"""
Processes a zipped project.
This method takes a path to a zip file, extracts its contents, and returns the path to the extracted project
file. If the zip file contains more than one project file or none, an error is raised.
Args:
zip_path (Path): The path to the zip file.
Raises:
ValueError: If there's more than 1 project file or none in the zip file.
Returns:
Path: The path to the main project file.
"""
work_path = zip_path.parent
try:
with zipfile.ZipFile(zip_path, 'r') as myzip:
myzip.extractall(str(work_path))
project_files = [p for p in work_path.iterdir() if p.is_file() and p.suffix.lower() != ".zip"]
logger.debug(f"Zip files: {project_files}")
# supported_exts = RenderWorkerFactory.class_for_name(engine).engine.supported_extensions
# if supported_exts:
# project_files = [file for file in project_files if any(file.endswith(ext) for ext in supported_exts)]
# If there's more than 1 project file or none, raise an error
if len(project_files) != 1:
raise ValueError(f'Cannot find a valid project file in {zip_path.name}')
extracted_project_path = work_path / project_files[0]
logger.info(f"Extracted zip file to {extracted_project_path}")
except (zipfile.BadZipFile, zipfile.LargeZipFile) as e:
logger.error(f"Error processing zip file: {e}")
raise ValueError(f'Error processing zip file: {e}') from e
return extracted_project_path
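The extract-and-validate rule above (exactly one non-zip file must come out of the archive) can be shown with only the standard library. A self-contained sketch under the same assumptions:

```python
import zipfile
from pathlib import Path


def extract_single_project(zip_path: Path) -> Path:
    """Extract a zip next to itself and return the lone non-zip file.

    Raises ValueError when zero or multiple candidate project files are
    found, mirroring the validation in process_zipped_project."""
    work_path = zip_path.parent
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(work_path)
    candidates = [p for p in work_path.iterdir()
                  if p.is_file() and p.suffix.lower() != ".zip"]
    if len(candidates) != 1:
        raise ValueError(f"Cannot find a valid project file in {zip_path.name}")
    return candidates[0]
```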


@@ -1,142 +0,0 @@
import logging
import os
import subprocess
import threading
from pathlib import Path
from src.utilities.ffmpeg_helper import generate_thumbnail, save_first_frame
logger = logging.getLogger()
supported_video_formats = ['.mp4', '.mov', '.avi', '.mpg', '.mpeg', '.mxf', '.m4v', '.mkv', '.webm']
supported_image_formats = ['.jpg', '.png', '.exr', '.tif', '.tga', '.bmp', '.webp']
class PreviewManager:
"""Manages generation, storage, and retrieval of preview images and videos for rendering jobs."""
storage_path = None
_running_jobs = {}
@classmethod
def __generate_job_preview_worker(cls, job, replace_existing=False, max_width=480):
"""Generates image and video previews for a given job.
Args:
job: The job object containing file information.
replace_existing (bool): Whether to replace existing previews. Defaults to False.
max_width (int): Maximum width for the preview images/videos. Defaults to 480.
"""
# Determine best source file to use for thumbs
job_file_list = job.file_list()
source_files = job_file_list if job_file_list else [job.input_path]
preview_label = "output" if job_file_list else "input"
# filter by type
found_image_files = [f for f in source_files if os.path.splitext(f)[-1].lower() in supported_image_formats]
found_video_files = [f for f in source_files if os.path.splitext(f)[-1].lower() in supported_video_formats]
# check if we even have any valid files to work from
if source_files and not found_video_files and not found_image_files:
logger.warning(f"No valid image or video files found in files from job: {job}")
return
os.makedirs(cls.storage_path, exist_ok=True)
base_path = os.path.join(cls.storage_path, f"{job.id}-{preview_label}-{max_width}")
preview_video_path = base_path + '.mp4'
preview_image_path = base_path + '.jpg'
if replace_existing:
for x in [preview_image_path, preview_video_path]:
try:
os.remove(x)
except OSError:
pass
# Generate image previews
if (found_video_files or found_image_files) and not os.path.exists(preview_image_path):
try:
path_of_source = found_image_files[-1] if found_image_files else found_video_files[-1]
logger.debug(f"Generating image preview for {path_of_source}")
save_first_frame(source_path=path_of_source, dest_path=preview_image_path, max_width=max_width)
logger.debug(f"Successfully created image preview for {path_of_source}")
except Exception as e:
logger.error(f"Error generating image preview for {job}: {e}")
# Generate video previews
if found_video_files and not os.path.exists(preview_video_path):
try:
path_of_source = found_video_files[0]
logger.debug(f"Generating video preview for {path_of_source}")
generate_thumbnail(source_path=path_of_source, dest_path=preview_video_path, max_width=max_width)
logger.debug(f"Successfully created video preview for {path_of_source}")
except subprocess.CalledProcessError as e:
logger.error(f"Error generating video preview for {job}: {e}")
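The worker's first step, splitting candidate source files into images and videos by extension, is a pure filter. A sketch of the same case-insensitive suffix check:

```python
import os

SUPPORTED_VIDEO = {'.mp4', '.mov', '.avi', '.mpg', '.mpeg', '.mxf', '.m4v', '.mkv', '.webm'}
SUPPORTED_IMAGE = {'.jpg', '.png', '.exr', '.tif', '.tga', '.bmp', '.webp'}


def split_by_media_kind(paths):
    """Split file paths into (images, videos) lists, comparing the
    lowercased extension against the supported-format sets above.
    Unrecognized files are dropped."""
    images = [p for p in paths if os.path.splitext(p)[-1].lower() in SUPPORTED_IMAGE]
    videos = [p for p in paths if os.path.splitext(p)[-1].lower() in SUPPORTED_VIDEO]
    return images, videos
```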
@classmethod
def update_previews_for_job(cls, job, replace_existing=False, wait_until_completion=False, timeout=None):
"""Updates previews for a given job by starting a background thread.
Args:
job: The job object.
replace_existing (bool): Whether to replace existing previews. Defaults to False.
wait_until_completion (bool): Whether to wait for the thread to complete. Defaults to False.
timeout (float): Timeout for waiting, if applicable.
"""
job_thread = cls._running_jobs.get(job.id)
if job_thread and job_thread.is_alive():
logger.debug(f'Preview generation job already running for {job}')
return
job_thread = threading.Thread(target=cls.__generate_job_preview_worker, args=(job, replace_existing,))
job_thread.start()
cls._running_jobs[job.id] = job_thread
if wait_until_completion:
job_thread.join(timeout=timeout)
@classmethod
def get_previews_for_job(cls, job):
"""Retrieves previews for a given job.
Args:
job: The job object.
Returns:
dict: A dictionary containing preview information.
"""
results = {}
try:
directory_path = Path(cls.storage_path)
preview_files_for_job = [f for f in directory_path.iterdir() if f.is_file() and f.name.startswith(job.id)]
for preview_filename in preview_files_for_job:
try:
pixel_width = preview_filename.stem.split('-')[-1]
preview_label = str(os.path.basename(preview_filename)).split('-')[1]
extension = os.path.splitext(preview_filename)[-1].lower()
kind = 'video' if extension in supported_video_formats else \
'image' if extension in supported_image_formats else 'unknown'
results[preview_label] = results.get(preview_label, [])
results[preview_label].append({'filename': str(preview_filename), 'width': pixel_width, 'kind': kind})
except IndexError: # ignore invalid filenames
pass
except FileNotFoundError:
pass
return results
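Preview files follow a `<job_id>-<label>-<width>.<ext>` naming scheme, and the parsing above recovers the parts by splitting on hyphens. A cleaned-up sketch of that parse (it assumes, as the original does, that the job id itself contains no hyphens):

```python
import os


def parse_preview_filename(filename: str):
    """Parse '<job_id>-<label>-<width>.<ext>' preview names into their
    parts, returning None for names that do not fit the pattern —
    the equivalent of the IndexError handling above."""
    base = os.path.basename(filename)
    stem, ext = os.path.splitext(base)
    parts = stem.split('-')
    if len(parts) < 3:
        return None
    return {'job_id': parts[0], 'label': parts[1],
            'width': parts[-1], 'ext': ext.lower()}
```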
@classmethod
def delete_previews_for_job(cls, job):
"""Deletes all previews associated with a given job.
Args:
job: The job object.
"""
all_previews = cls.get_previews_for_job(job)
flattened_list = [item for sublist in all_previews.values() for item in sublist]
for preview in flattened_list:
try:
logger.debug(f"Removing preview: {preview['filename']}")
os.remove(preview['filename'])
except OSError as e:
logger.error(f"Error removing preview '{preview.get('filename')}': {e}")


@@ -1,13 +1,12 @@
import json
import logging
import os
import socket
import threading
import time
from pathlib import Path
import requests
from requests_toolbelt.multipart import MultipartEncoder, MultipartEncoderMonitor
from urllib.parse import urljoin
from src.utilities.misc_helper import is_localhost
from src.utilities.status_utils import RenderStatus
@@ -17,18 +16,14 @@ status_colors = {RenderStatus.ERROR: "red", RenderStatus.CANCELLED: 'orange1', R
RenderStatus.RUNNING: 'cyan', RenderStatus.WAITING_FOR_SUBJOBS: 'blue'}
categories = [RenderStatus.RUNNING, RenderStatus.WAITING_FOR_SUBJOBS, RenderStatus.ERROR, RenderStatus.NOT_STARTED,
RenderStatus.SCHEDULED, RenderStatus.COMPLETED, RenderStatus.CANCELLED, RenderStatus.UNDEFINED]
logger = logging.getLogger()
OFFLINE_MAX = 2
class RenderServerProxy:
"""The ServerProxy class is responsible for interacting with a remote server.
It provides convenience methods to request data from the server and store the status of the server.
"""
def __init__(self, hostname, server_port="8080"):
self.hostname = hostname
self.port = server_port
@@ -39,34 +34,21 @@ class RenderServerProxy:
self.__background_thread = None
self.__offline_flags = 0
self.update_cadence = 5
self.is_localhost = bool(is_localhost(hostname))
# Cache some basic server info
self.system_cpu = None
self.system_cpu_count = None
self.system_os = None
self.system_os_version = None
self.system_api_version = None
# --------------------------------------------
# Basics / Connection:
# --------------------------------------------
def __repr__(self):
return f"<RenderServerProxy - {self.hostname}>"
def check_connection(self):
try:
return self.request("heartbeat").ok
except Exception:
pass
return False
def connect(self):
return self.status()
def is_online(self):
if self.__update_in_background:
return self.__offline_flags < OFFLINE_MAX
else:
return self.check_connection()
return self.connect() is not None
def status(self):
if not self.is_online():
@@ -74,16 +56,11 @@ class RenderServerProxy:
running_jobs = [x for x in self.__jobs_cache if x['status'] == 'running'] if self.__jobs_cache else []
return f"{len(running_jobs)} running" if running_jobs else "Ready"
# --------------------------------------------
# Requests:
# --------------------------------------------
def request_data(self, payload, timeout=5):
try:
req = self.request(payload, timeout)
if req.ok:
if req.ok and req.status_code == 200:
self.__offline_flags = 0
if req.status_code == 200:
return req.json()
except json.JSONDecodeError as e:
logger.debug(f"JSON decode error: {e}")
@@ -95,23 +72,10 @@ class RenderServerProxy:
self.__offline_flags = self.__offline_flags + 1
except Exception as e:
logger.exception(f"Uncaught exception: {e}")
# If server unexpectedly drops off the network, stop background updates
if self.__offline_flags > OFFLINE_MAX:
try:
self.stop_background_update()
except KeyError:
pass
return None
def request(self, payload, timeout=5):
from src.api.api_server import API_VERSION
return requests.get(f'http://{self.hostname}:{self.port}/api/{payload}', timeout=timeout,
headers={"X-API-Version": str(API_VERSION)})
# --------------------------------------------
# Background Updates:
# --------------------------------------------
return requests.get(f'http://{self.hostname}:{self.port}/api/{payload}', timeout=timeout)
def start_background_update(self):
if self.__update_in_background:
@@ -119,23 +83,27 @@ class RenderServerProxy:
self.__update_in_background = True
def thread_worker():
logger.debug(f'Starting background updates for {self.hostname}')
while self.__update_in_background:
self.__update_job_cache()
time.sleep(self.update_cadence)
logger.debug(f'Stopping background updates for {self.hostname}')
self.__background_thread = threading.Thread(target=thread_worker)
self.__background_thread.daemon = True
self.__background_thread.start()
def __update_job_cache(self, timeout=40, ignore_token=False):
def stop_background_update(self):
self.__update_in_background = False
if self.__offline_flags: # if we're offline, don't bother with the long poll
ignore_token = True
def get_job_info(self, job_id, timeout=5):
return self.request_data(f'job/{job_id}', timeout=timeout)
url = f'jobs_long_poll?token={self.__jobs_cache_token}' if (self.__jobs_cache_token and
not ignore_token) else 'jobs'
def get_all_jobs(self, timeout=5, ignore_token=False):
if not self.__update_in_background or ignore_token:
self.__update_job_cache(timeout, ignore_token)
return self.__jobs_cache.copy() if self.__jobs_cache else None
def __update_job_cache(self, timeout=5, ignore_token=False):
url = f'jobs?token={self.__jobs_cache_token}' if self.__jobs_cache_token and not ignore_token else 'jobs'
status_result = self.request_data(url, timeout=timeout)
if status_result is not None:
sorted_jobs = []
@@ -146,83 +114,9 @@ class RenderServerProxy:
self.__jobs_cache = sorted_jobs
self.__jobs_cache_token = status_result['token']
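The cache update above passes the last-seen token back to the server so unchanged job lists are skipped, and falls back to a plain fetch when there is no token (or the caller opts out, e.g. after going offline). The URL selection can be sketched on its own; the exact endpoint name varies in this codebase, so `jobs` here is illustrative:

```python
def build_jobs_url(token, ignore_token=False):
    """Build the jobs-polling endpoint path: include the last-seen
    cache token so the server can skip unchanged results, unless the
    token is absent or explicitly ignored."""
    if token and not ignore_token:
        return f"jobs?token={token}"
    return "jobs"
```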
def stop_background_update(self):
self.__update_in_background = False
# --------------------------------------------
# Get System Info:
# --------------------------------------------
def get_all_jobs(self, timeout=5, ignore_token=False):
if not self.__update_in_background or ignore_token:
self.__update_job_cache(timeout, ignore_token)
return self.__jobs_cache.copy() if self.__jobs_cache else None
def get_data(self, timeout=5):
return self.request_data('full_status', timeout=timeout)
def get_status(self):
status = self.request_data('status')
if status and not self.system_cpu:
self.system_cpu = status['system_cpu']
self.system_cpu_count = status['cpu_count']
self.system_os = status['system_os']
self.system_os_version = status['system_os_version']
self.system_api_version = status['api_version']
return status
# --------------------------------------------
# Get Job Info:
# --------------------------------------------
def get_job_info(self, job_id, timeout=5):
return self.request_data(f'job/{job_id}', timeout=timeout)
def get_job_files_list(self, job_id):
return self.request_data(f"job/{job_id}/file_list")
# --------------------------------------------
# Job Lifecycle:
# --------------------------------------------
def post_job_to_server(self, file_path: Path, job_data, callback=None):
"""
Posts a job to the server.
Args:
file_path (Path): The path to the file to upload.
job_data (dict): A dict of jobs data.
callback (function, optional): A callback function to call during the upload. Defaults to None.
Returns:
Response: The response from the server.
"""
# Check if file exists
if not file_path.exists():
raise FileNotFoundError(f"File not found: {file_path}")
# Bypass uploading file if posting to localhost
if self.is_localhost:
job_data['local_path'] = str(file_path)
url = urljoin(f'http://{self.hostname}:{self.port}', '/api/add_job')
headers = {'Content-Type': 'application/json'}
return requests.post(url, data=json.dumps(job_data), headers=headers)
# Prepare the form data for remote host
with open(file_path, 'rb') as file:
encoder = MultipartEncoder({
'file': (file_path.name, file, 'application/octet-stream'),
'json': (None, json.dumps(job_data), 'application/json'),
})
# Create a monitor that will track the upload progress
monitor = MultipartEncoderMonitor(encoder, callback) if callback else MultipartEncoderMonitor(encoder)
headers = {'Content-Type': monitor.content_type}
url = urljoin(f'http://{self.hostname}:{self.port}', '/api/add_job')
# Send the request with proper resource management
with requests.post(url, data=monitor, headers=headers) as response:
return response
all_data = self.request_data('full_status', timeout=timeout)
return all_data
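`post_job_to_server` branches on whether the target is the local machine: localhost posts skip the upload entirely and send a JSON body with a `local_path`, while remote hosts get a multipart upload. A simplified, network-free stand-in for that decision (the `localhosts` tuple here is a hypothetical substitute for `is_localhost()`):

```python
def prepare_job_post(hostname, file_path, job_data,
                     localhosts=("localhost", "127.0.0.1")):
    """Decide how a job should be posted: localhost servers receive the
    job data plus a 'local_path' (no file transfer needed), remote
    servers receive the original data for a multipart upload."""
    if hostname in localhosts:
        return {"mode": "json", "data": {**job_data, "local_path": file_path}}
    return {"mode": "multipart", "data": job_data}
```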
def cancel_job(self, job_id, confirm=False):
return self.request_data(f'job/{job_id}/cancel?confirm={confirm}')
@@ -230,94 +124,68 @@ class RenderServerProxy:
def delete_job(self, job_id, confirm=False):
return self.request_data(f'job/{job_id}/delete?confirm={confirm}')
def send_subjob_update_notification(self, parent_id, subjob):
"""
Notifies the parent job of an update in a subjob.
def get_status(self):
status = self.request_data('status')
if not self.system_cpu:
self.system_cpu = status['system_cpu']
self.system_cpu_count = status['cpu_count']
self.system_os = status['system_os']
self.system_os_version = status['system_os_version']
return status
Args:
parent_id (str): The ID of the parent job.
subjob (Job): The subjob that has updated.
def is_engine_available(self, engine_name):
return self.request_data(f'{engine_name}/is_available')
Returns:
Response: The response from the server.
"""
return requests.post(f'http://{self.hostname}:{self.port}/api/job/{parent_id}/send_subjob_update_notification',
def get_all_engines(self):
return self.request_data('all_engines')
def notify_parent_of_status_change(self, parent_id, subjob):
return requests.post(f'http://{self.hostname}:{self.port}/api/job/{parent_id}/notify_parent_of_status_change',
json=subjob.json())
# --------------------------------------------
# Engines:
# --------------------------------------------
def post_job_to_server(self, file_path, job_list, callback=None):
def get_engine_for_filename(self, filename:str, timeout=5):
response = self.request(f'engine_for_filename?filename={os.path.basename(filename)}', timeout)
return response.text
# bypass uploading file if posting to localhost
if is_localhost(self.hostname):
jobs_with_path = [{**item, "local_path": file_path} for item in job_list]
return requests.post(f'http://{self.hostname}:{self.port}/api/add_job', data=json.dumps(jobs_with_path),
headers={'Content-Type': 'application/json'})
def get_installed_engines(self, timeout=5):
return self.request_data('installed_engines', timeout)
# Prepare the form data
encoder = MultipartEncoder({
'file': (os.path.basename(file_path), open(file_path, 'rb'), 'application/octet-stream'),
'json': (None, json.dumps(job_list), 'application/json'),
})
def is_engine_available(self, engine_name:str, timeout=5):
return self.request_data(f'{engine_name}/is_available', timeout)
# Create a monitor that will track the upload progress
if callback:
monitor = MultipartEncoderMonitor(encoder, callback(encoder))
else:
monitor = MultipartEncoderMonitor(encoder)
def get_all_engine_info(self, response_type='standard', timeout=5):
"""
Fetches all engine information from the server.
# Send the request
headers = {'Content-Type': monitor.content_type}
return requests.post(f'http://{self.hostname}:{self.port}/api/add_job', data=monitor, headers=headers)
Args:
response_type (str, optional): Returns standard or full version of engine info
timeout (int, optional): The number of seconds to wait for a response from the server. Defaults to 5.
Returns:
dict: A dictionary containing the engine information.
"""
all_data = self.request_data(f"engine_info?response_type={response_type}", timeout=timeout)
return all_data
def get_engine_info(self, engine_name:str, response_type='standard', timeout=5):
"""
Fetches specific engine information from the server.
Args:
engine_name (str): Name of the engine
response_type (str, optional): Returns standard or full version of engine info
timeout (int, optional): The number of seconds to wait for a response from the server. Defaults to 5.
Returns:
dict: A dictionary containing the engine information.
"""
return self.request_data(f'{engine_name}/info?response_type={response_type}', timeout)
def delete_engine(self, engine_name:str, version:str, system_cpu=None):
"""
Sends a request to the server to delete a specific engine.
Args:
engine_name (str): The name of the engine to delete.
version (str): The version of the engine to delete.
system_cpu (str, optional): The system CPU type. Defaults to None.
Returns:
Response: The response from the server.
"""
form_data = {'engine': engine_name, 'version': version, 'system_cpu': system_cpu}
return requests.post(f'http://{self.hostname}:{self.port}/api/delete_engine', json=form_data)
# --------------------------------------------
# Download Files:
# --------------------------------------------
def download_all_job_files(self, job_id, save_path):
def get_job_files(self, job_id, save_path):
url = f"http://{self.hostname}:{self.port}/api/job/{job_id}/download_all"
return self.__download_file_from_url(url, output_filepath=save_path)
def download_job_file(self, job_id, job_filename, save_path):
url = f"http://{self.hostname}:{self.port}/api/job/{job_id}/download?filename={job_filename}"
return self.__download_file_from_url(url, output_filepath=save_path)
return self.download_file(url, filename=save_path)
@staticmethod
def __download_file_from_url(url, output_filepath):
def download_file(url, filename):
with requests.get(url, stream=True) as r:
r.raise_for_status()
with open(output_filepath, 'wb') as f:
with open(filename, 'wb') as f:
for chunk in r.iter_content(chunk_size=8192):
f.write(chunk)
return output_filepath
return filename
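The chunked streaming pattern in `__download_file_from_url` can be sketched without a live server; here an in-memory list of byte chunks stands in for `r.iter_content(chunk_size=8192)`, and the helper name is illustrative rather than part of the codebase:

```python
import os
import tempfile

def write_chunks_to_file(chunks, output_filepath):
    # Mirrors the pattern above: write the body to disk chunk by
    # chunk instead of buffering the whole response in memory.
    with open(output_filepath, 'wb') as f:
        for chunk in chunks:
            if chunk:  # skip empty keep-alive chunks
                f.write(chunk)
    return output_filepath

# Stand-in for a streamed HTTP response body
path = os.path.join(tempfile.gettempdir(), 'demo_frames.zip')
write_chunks_to_file([b'abc', b'', b'def'], path)
print(open(path, 'rb').read())  # → b'abcdef'
```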
# --- Renderer --- #
def get_renderer_info(self, timeout=5):
all_data = self.request_data(f'renderer_info', timeout=timeout)
return all_data
def delete_engine(self, engine, version, system_cpu=None):
form_data = {'engine': engine, 'version': version, 'system_cpu': system_cpu}
return requests.post(f'http://{self.hostname}:{self.port}/api/delete_engine', json=form_data)


@@ -17,19 +17,19 @@ class ServerProxyManager:
pub.subscribe(cls.__zeroconf_state_change, 'zeroconf_state_change')
@classmethod
def __zeroconf_state_change(cls, hostname, state_change):
def __zeroconf_state_change(cls, hostname, state_change, info):
if state_change == ServiceStateChange.Added or state_change == ServiceStateChange.Updated:
cls.get_proxy_for_hostname(hostname)
else:
cls.get_proxy_for_hostname(hostname).stop_background_update()
cls.server_proxys.pop(hostname)
@classmethod
def get_proxy_for_hostname(cls, hostname):
found_proxy = cls.server_proxys.get(hostname)
if hostname and not found_proxy:
if not found_proxy:
new_proxy = RenderServerProxy(hostname)
new_proxy.start_background_update()
cls.server_proxys[hostname] = new_proxy
found_proxy = new_proxy
return found_proxy


@@ -1,19 +1,15 @@
import logging
import os
import socket
import threading
import time
import zipfile
from click import Path
from plyer import notification
from pubsub import pub
from src.api.preview_manager import PreviewManager
from src.api.server_proxy import RenderServerProxy
from src.engines.engine_manager import EngineManager
from src.render_queue import RenderQueue
from src.utilities.config import Config
from src.utilities.server_helper import download_missing_frames_from_subjob, distribute_server_work
from src.utilities.misc_helper import get_file_size_human
from src.utilities.status_utils import RenderStatus, string_to_status
from src.utilities.zeroconf_server import ZeroconfServer
@@ -32,54 +28,17 @@ class DistributedJobManager:
This should be called once, typically during the initialization phase.
"""
pub.subscribe(cls.__local_job_status_changed, 'status_change')
pub.subscribe(cls.__local_job_frame_complete, 'frame_complete')
@classmethod
def __local_job_frame_complete(cls, job_id, frame_number, update_interval=5):
"""
Responds to the 'frame_complete' pubsub message for local jobs.
Args:
job_id (str): The ID of the job that completed a frame.
frame_number (int): The frame number that just finished rendering.
update_interval (int, optional): Replace existing previews every N frames. Defaults to 5.
Note: Do not call directly. Instead, call via the 'frame_complete' pubsub message.
"""
render_job = RenderQueue.job_with_id(job_id, none_ok=True)
if not render_job: # ignore jobs not in the queue
return
logger.debug(f"Job {job_id} has completed frame #{frame_number}")
replace_existing_previews = (frame_number % update_interval) == 0
cls.__job_update_shared(render_job, replace_existing_previews)
@classmethod
def __job_update_shared(cls, render_job, replace_existing_previews=False):
# update previews
PreviewManager.update_previews_for_job(job=render_job, replace_existing=replace_existing_previews)
# notify parent to allow individual frames to be copied instead of waiting until the end
if render_job.parent:
parent_id, parent_hostname = render_job.parent.split('@')[0], render_job.parent.split('@')[-1]
try:
logger.debug(f'Job {render_job.id} updating parent {parent_id}@{parent_hostname}')
RenderServerProxy(parent_hostname).send_subjob_update_notification(parent_id, render_job)
except Exception as e:
logger.error(f"Error notifying parent {parent_hostname} about update in subjob {render_job.id}: {e}")
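Parent and child jobs are referenced throughout by `job_id@hostname` strings, split as above. A minimal standalone sketch of that convention (the helper name is hypothetical, not part of the codebase):

```python
def parse_job_key(key):
    # Same split used for render_job.parent above: everything before the
    # first '@' is the job id, the segment after the last '@' the hostname.
    return key.split('@')[0], key.split('@')[-1]

print(parse_job_key('8f3a@render-01.local'))  # → ('8f3a', 'render-01.local')
```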
@classmethod
def __local_job_status_changed(cls, job_id: str, old_status: str, new_status: str):
def __local_job_status_changed(cls, job_id, old_status, new_status):
"""
Responds to the 'status_change' pubsub message for local jobs.
If it's a child job, it notifies the parent job about the status change.
Args:
job_id (str): The ID of the job that has changed status.
old_status (str): The previous status of the job.
new_status (str): The new (current) status of the job.
Parameters:
job_id (str): The ID of the job that has changed status.
old_status (str): The previous status of the job.
new_status (str): The new (current) status of the job.
Note: Do not call directly. Instead, call via the 'status_change' pubsub message.
"""
@@ -89,34 +48,34 @@ class DistributedJobManager:
return
logger.debug(f"Job {job_id} status change: {old_status} -> {new_status}")
if render_job.parent: # If local job is a subjob from a remote server
parent_id, hostname = render_job.parent.split('@')[0], render_job.parent.split('@')[-1]
RenderServerProxy(hostname).notify_parent_of_status_change(parent_id=parent_id, subjob=render_job)
cls.__job_update_shared(render_job, replace_existing_previews=(render_job.status == RenderStatus.COMPLETED))
# Handle children
if render_job.children:
if new_status in [RenderStatus.CANCELLED, RenderStatus.ERROR]: # Cancel children if necessary
for child in render_job.children:
child_id, child_hostname = child.split('@')
RenderServerProxy(child_hostname).cancel_job(child_id, confirm=True)
# handle cancelling all the children
elif render_job.children and new_status in [RenderStatus.CANCELLED, RenderStatus.ERROR]:
for child in render_job.children:
child_id, hostname = child.split('@')
RenderServerProxy(hostname).cancel_job(child_id, confirm=True)
# UI Notifications
try:
if new_status == RenderStatus.COMPLETED:
logger.debug("Show render complete notification")
logger.debug("show render complete notification")
notification.notify(
title='Render Job Complete',
message=f'{render_job.name} completed successfully',
timeout=10 # Display time in seconds
)
elif new_status == RenderStatus.ERROR:
logger.debug("Show render error notification")
logger.debug("show render complete notification")
notification.notify(
title='Render Job Failed',
message=f'{render_job.name} failed rendering',
timeout=10 # Display time in seconds
)
elif new_status == RenderStatus.RUNNING:
logger.debug("Show render started notification")
logger.debug("show render complete notification")
notification.notify(
title='Render Job Started',
message=f'{render_job.name} started rendering',
@@ -125,245 +84,276 @@ class DistributedJobManager:
except Exception as e:
logger.debug(f"Unable to show UI notification: {e}")
# --------------------------------------------
# Create Job
# --------------------------------------------
@classmethod
def create_render_job(cls, new_job_attributes: dict, loaded_project_local_path: Path):
"""Creates render jobs. Pass in dict of job_data and the local path to the project. It creates and returns a new
render job.
def handle_subjob_status_change(cls, local_job, subjob_data):
"""
Responds to a status change from a remote subjob and triggers the creation or modification of subjobs as needed.
Args:
new_job_attributes (dict): Dict of desired attributes for new job (frame count, renderer, output path, etc)
loaded_project_local_path (Path): The local path to the loaded project.
Parameters:
local_job (BaseRenderWorker): The local parent job worker.
subjob_data (dict): subjob data sent from remote server.
Returns:
worker: Created job worker
"""
# get new output path in output_dir
output_path = new_job_attributes.get('output_path')
output_filename = loaded_project_local_path.name if output_path else loaded_project_local_path.stem
# Prepare output path
output_dir = loaded_project_local_path.parent.parent / "output"
output_path = output_dir / output_filename
os.makedirs(output_dir, exist_ok=True)
logger.debug(f"New job output path: {output_path}")
# create & configure jobs
worker = EngineManager.create_worker(engine_name=new_job_attributes['engine_name'],
input_path=loaded_project_local_path,
output_path=output_path,
engine_version=new_job_attributes.get('engine_version'),
args=new_job_attributes.get('args', {}),
parent=new_job_attributes.get('parent'),
name=new_job_attributes.get('name'))
worker.status = new_job_attributes.get("initial_status", worker.status) # todo: is this necessary?
worker.priority = int(new_job_attributes.get('priority', worker.priority))
worker.start_frame = int(new_job_attributes.get("start_frame", worker.start_frame))
worker.end_frame = int(new_job_attributes.get("end_frame", worker.end_frame))
worker.watchdog_timeout = Config.worker_process_timeout
worker.hostname = socket.gethostname()
# determine if we can / should split the job
if new_job_attributes.get("enable_split_jobs", False) and (worker.total_frames > 1) and not worker.parent:
cls.split_into_subjobs_async(worker, new_job_attributes, loaded_project_local_path)
else:
worker.status = RenderStatus.NOT_STARTED
RenderQueue.add_to_render_queue(worker, force_start=new_job_attributes.get('force_start', False))
PreviewManager.update_previews_for_job(worker)
return worker
# --------------------------------------------
# Handling Subjobs
# --------------------------------------------
@classmethod
def handle_subjob_update_notification(cls, local_job, subjob_data: dict):
"""Responds to a notification from a remote subjob and the host requests any subsequent updates from the subjob.
Args:
local_job (BaseRenderWorker): The local parent job worker.
subjob_data (dict): Subjob data sent from the remote server.
None
"""
subjob_status = string_to_status(subjob_data['status'])
subjob_id = subjob_data['id']
subjob_hostname = subjob_data['hostname']
subjob_key = f'{subjob_id}@{subjob_hostname}'
old_status = local_job.children.get(subjob_key, {}).get('status')
local_job.children[subjob_key] = subjob_data
subjob_hostname = next((hostname.split('@')[1] for hostname in local_job.children if
hostname.split('@')[0] == subjob_id), None)
local_job.children[f'{subjob_id}@{subjob_hostname}'] = subjob_data
logname = f"<Parent: {local_job.id} | Child: {subjob_key}>"
if old_status != subjob_status.value:
logger.debug(f"Subjob status changed: {logname} -> {subjob_status.value}")
logname = f"{local_job.id}:{subjob_id}@{subjob_hostname}"
logger.debug(f"Subjob status changed: {logname} -> {subjob_status.value}")
download_success = download_missing_frames_from_subjob(local_job, subjob_id, subjob_hostname)
if subjob_data['status'] == 'completed' and download_success:
local_job.children[subjob_key]['download_status'] = 'completed'
# Download complete or partial render jobs
if subjob_status in [RenderStatus.COMPLETED, RenderStatus.CANCELLED, RenderStatus.ERROR] and \
subjob_data['file_count']:
download_result = cls.download_from_subjob(local_job, subjob_id, subjob_hostname)
if not download_result:
# todo: handle error
logger.error(f"Unable to download subjob files from {logname} with status {subjob_status.value}")
@classmethod
def wait_for_subjobs(cls, parent_job):
"""Check the status of subjobs and waits until they are all finished. Download rendered frames from subjobs
when they are completed.
if subjob_status == RenderStatus.CANCELLED or subjob_status == RenderStatus.ERROR:
# todo: determine missing frames and schedule new job
pass
Args:
parent_job: Worker object that has child jobs
@staticmethod
def download_from_subjob(local_job, subjob_id, subjob_hostname):
"""
Downloads and extracts files from a completed subjob on a remote server.
Parameters:
local_job (BaseRenderWorker): The local parent job worker.
subjob_id (str or int): The ID of the subjob.
subjob_hostname (str): The hostname of the remote server where the subjob is located.
Returns:
bool: True if the files have been downloaded and extracted successfully, False otherwise.
"""
logger.debug(f"Waiting for subjobs for job {parent_job}")
parent_job.status = RenderStatus.WAITING_FOR_SUBJOBS
child_key = f'{subjob_id}@{subjob_hostname}'
logname = f"{local_job.id}:{child_key}"
zip_file_path = local_job.output_path + f'_{subjob_hostname}_{subjob_id}.zip'
# download zip file from server
try:
local_job.children[child_key]['download_status'] = 'working'
logger.info(f"Downloading completed subjob files from {subjob_hostname} to localhost")
RenderServerProxy(subjob_hostname).get_job_files(subjob_id, zip_file_path)
logger.info(f"File transfer complete for {logname} - Transferred {get_file_size_human(zip_file_path)}")
except Exception as e:
logger.exception(f"Exception downloading files from remote server: {e}")
local_job.children[child_key]['download_status'] = 'failed'
return False
# extract zip
try:
logger.debug(f"Extracting zip file: {zip_file_path}")
extract_path = os.path.dirname(zip_file_path)
with zipfile.ZipFile(zip_file_path, 'r') as zip_ref:
zip_ref.extractall(extract_path)
logger.info(f"Successfully extracted zip to: {extract_path}")
os.remove(zip_file_path)
local_job.children[child_key]['download_status'] = 'complete'
except Exception as e:
logger.exception(f"Exception extracting zip file: {e}")
local_job.children[child_key]['download_status'] = 'failed'
return local_job.children[child_key].get('download_status', None) == 'complete'
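The extract-then-delete step in `download_from_subjob` can be exercised standalone by building a throwaway archive first; file names here are fabricated for illustration:

```python
import os
import tempfile
import zipfile

def extract_and_remove(zip_file_path):
    # Extract next to the archive, then delete the archive,
    # matching what download_from_subjob does after the transfer.
    extract_path = os.path.dirname(zip_file_path)
    with zipfile.ZipFile(zip_file_path, 'r') as zip_ref:
        zip_ref.extractall(extract_path)
    os.remove(zip_file_path)
    return extract_path

# Build a throwaway archive to demonstrate
workdir = tempfile.mkdtemp()
archive = os.path.join(workdir, 'job_frames.zip')
with zipfile.ZipFile(archive, 'w') as zf:
    zf.writestr('frame_0001.png', b'fake frame data')
out_dir = extract_and_remove(archive)
print(sorted(os.listdir(out_dir)))  # → ['frame_0001.png']
```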
@classmethod
def wait_for_subjobs(cls, local_job):
logger.debug(f"Waiting for subjobs for job {local_job}")
local_job.status = RenderStatus.WAITING_FOR_SUBJOBS
statuses_to_download = [RenderStatus.CANCELLED, RenderStatus.ERROR, RenderStatus.COMPLETED]
def subjobs_not_downloaded():
return {k: v for k, v in parent_job.children.items() if 'download_status' not in v or
return {k: v for k, v in local_job.children.items() if 'download_status' not in v or
v['download_status'] == 'working' or v['download_status'] is None}
logger.info(f'Waiting on {len(subjobs_not_downloaded())} subjobs for {parent_job.id}')
logger.info(f'Waiting on {len(subjobs_not_downloaded())} subjobs for {local_job.id}')
server_delay = 10
sleep_counter = 0
while parent_job.status == RenderStatus.WAITING_FOR_SUBJOBS:
while len(subjobs_not_downloaded()):
for child_key, subjob_cached_data in subjobs_not_downloaded().items():
if sleep_counter % server_delay == 0: # only ping servers every x seconds
for child_key, subjob_cached_data in subjobs_not_downloaded().items():
subjob_id = child_key.split('@')[0]
subjob_hostname = child_key.split('@')[-1]
subjob_id = child_key.split('@')[0]
subjob_hostname = child_key.split('@')[-1]
# Fetch info from server and handle failing case
subjob_data = RenderServerProxy(subjob_hostname).get_job_info(subjob_id)
if not subjob_data:
logger.warning(f"No response from: {subjob_hostname}")
# todo: handle timeout / missing server situations
continue
# Fetch info from server and handle failing case
subjob_data = RenderServerProxy(subjob_hostname).get_job_info(subjob_id)
if not subjob_data:
logger.warning(f"No response from {subjob_hostname}")
# timeout / missing server situations
parent_job.children[child_key]['download_status'] = f'error: No response from {subjob_hostname}'
continue
# Update parent job cache but keep the download status
download_status = local_job.children[child_key].get('download_status', None)
local_job.children[child_key] = subjob_data
local_job.children[child_key]['download_status'] = download_status
# Update parent job cache but keep the download status
download_status = parent_job.children[child_key].get('download_status', None)
parent_job.children[child_key] = subjob_data
parent_job.children[child_key]['download_status'] = download_status
status = string_to_status(subjob_data.get('status', ''))
status_msg = f"Subjob {child_key} | {status} | " \
f"{float(subjob_data.get('percent_complete')) * 100.0}%"
logger.debug(status_msg)
status = string_to_status(subjob_data.get('status', ''))
status_msg = f"Subjob {child_key} | {status} | " \
f"{float(subjob_data.get('percent_complete')) * 100.0}%"
logger.debug(status_msg)
# Still working in another thread - keep waiting
if download_status == 'working':
continue
# Check if job is finished, but has not had files copied over yet
if download_status is None and subjob_data['file_count'] and status in statuses_to_download:
try:
download_missing_frames_from_subjob(parent_job, subjob_id, subjob_hostname)
parent_job.children[child_key]['download_status'] = 'complete'
except Exception as e:
logger.error(f"Error downloading missing frames from subjob: {e}")
parent_job.children[child_key]['download_status'] = f'error: {e}'
# Check if job is finished, but has not had files copied over yet
if download_status is None and subjob_data['file_count'] and status in statuses_to_download:
download_result = cls.download_from_subjob(local_job, subjob_id, subjob_hostname)
if not download_result:
logger.error("Failed to download from subjob")
# todo: error handling here
# Any finished jobs not successfully downloaded at this point are skipped
if parent_job.children[child_key].get('download_status', None) is None and \
status in statuses_to_download:
logger.warning(f"Skipping waiting on downloading from subjob: {child_key}")
parent_job.children[child_key]['download_status'] = 'skipped'
# Any finished jobs not successfully downloaded at this point are skipped
if local_job.children[child_key].get('download_status', None) is None and \
status in statuses_to_download:
logger.warning(f"Skipping waiting on downloading from subjob: {child_key}")
local_job.children[child_key]['download_status'] = 'skipped'
if subjobs_not_downloaded():
logger.debug(f"Waiting on {len(subjobs_not_downloaded())} subjobs on "
f"{', '.join(list(subjobs_not_downloaded().keys()))}")
time.sleep(1)
sleep_counter += 1
else: # exit the loop
parent_job.status = RenderStatus.RUNNING
# --------------------------------------------
# Creating Subjobs
# --------------------------------------------
time.sleep(5)
@classmethod
def split_into_subjobs_async(cls, parent_worker, new_job_attributes, project_path, system_os=None):
# todo: I don't love this
parent_worker.status = RenderStatus.CONFIGURING
cls.background_worker = threading.Thread(target=cls.split_into_subjobs, args=(parent_worker, new_job_attributes,
project_path, system_os))
cls.background_worker.start()
@classmethod
def split_into_subjobs(cls, parent_worker, new_job_attributes, project_path, system_os=None, specific_servers=None):
"""
Splits a job into subjobs and distributes them among available servers.
This method checks the availability of servers, distributes the work among them, and creates subjobs on each
server. If a server is the local host, it adjusts the frame range of the parent job instead of creating a
subjob.
Args:
parent_worker (Worker): The parent job that we're creating the subjobs for.
new_job_attributes (dict): Dict of desired attributes for new job (frame count, engine, output path, etc)
project_path (str): The path to the project.
system_os (str, optional): Required OS. Default is any.
specific_servers (list, optional): List of specific servers to split work between. Defaults to all found.
"""
def split_into_subjobs(cls, worker, job_data, project_path, system_os=None):
# Check availability
available_servers = specific_servers if specific_servers else cls.find_available_servers(parent_worker.engine_name,
system_os)
# skip if there are no external servers found
external_servers = [x for x in available_servers if x['hostname'] != parent_worker.hostname]
if not external_servers:
parent_worker.status = RenderStatus.NOT_STARTED
return
logger.debug(f"Splitting into subjobs - Available servers: {[x['hostname'] for x in available_servers]}")
all_subjob_server_data = distribute_server_work(parent_worker.start_frame, parent_worker.end_frame, available_servers)
available_servers = cls.find_available_servers(worker.renderer, system_os)
logger.debug(f"Splitting into subjobs - Available servers: {available_servers}")
subjob_servers = cls.distribute_server_work(worker.start_frame, worker.end_frame, available_servers)
local_hostname = socket.gethostname()
# Prep and submit these sub-jobs
logger.info(f"Job {parent_worker.id} split plan: {all_subjob_server_data}")
logger.info(f"Job {worker.id} split plan: {subjob_servers}")
try:
for subjob_data in all_subjob_server_data:
subjob_hostname = subjob_data['hostname']
post_results = cls.__create_subjob(new_job_attributes, project_path, subjob_data, subjob_hostname,
parent_worker)
if not post_results.ok:
raise ValueError(f"Failed to create subjob on {subjob_hostname}")
for server_data in subjob_servers:
server_hostname = server_data['hostname']
if server_hostname != local_hostname:
post_results = cls.__create_subjob(job_data, local_hostname, project_path, server_data,
server_hostname, worker)
if post_results.ok:
server_data['submission_results'] = post_results.json()[0]
else:
logger.error(f"Failed to create subjob on {server_hostname}")
break
else:
# truncate parent render_job
worker.start_frame = max(server_data['frame_range'][0], worker.start_frame)
worker.end_frame = min(server_data['frame_range'][-1], worker.end_frame)
logger.info(f"Local job now rendering from {worker.start_frame} to {worker.end_frame}")
server_data['submission_results'] = worker.json()
# save child info
submission_results = post_results.json()[0]
child_key = f"{submission_results['id']}@{subjob_hostname}"
parent_worker.children[child_key] = submission_results
# check that job posts were all successful.
if not all(d.get('submission_results') is not None for d in subjob_servers):
raise ValueError("Failed to create all subjobs") # look into recalculating job #s and using existing jobs
# start subjobs
logger.debug(f"Created {len(all_subjob_server_data)} subjobs successfully")
parent_worker.name = f"{parent_worker.name} (Parent)"
parent_worker.status = RenderStatus.NOT_STARTED # todo: this won't work with scheduled starts
logger.debug(f"Starting {len(subjob_servers) - 1} attempted subjobs")
for server_data in subjob_servers:
if server_data['hostname'] != local_hostname:
child_key = f"{server_data['submission_results']['id']}@{server_data['hostname']}"
worker.children[child_key] = server_data['submission_results']
worker.name = f"{worker.name}[{worker.start_frame}-{worker.end_frame}]"
except Exception as e:
# cancel all the subjobs
logger.error(f"Failed to split job into subjobs: {e}")
logger.debug(f"Cancelling {len(all_subjob_server_data) - 1} attempted subjobs")
RenderServerProxy(parent_worker.hostname).cancel_job(parent_worker.id, confirm=True)
logger.debug(f"Cancelling {len(subjob_servers) - 1} attempted subjobs")
# [RenderServerProxy(hostname).cancel_job(results['id'], confirm=True) for hostname, results in
# submission_results.items()] # todo: fix this
@staticmethod
def __create_subjob(new_job_attributes: dict, project_path, server_data, server_hostname: str, parent_worker):
"""Convenience method to create subjobs for a parent worker"""
subjob = new_job_attributes.copy()
subjob['name'] = f"{parent_worker.name}[{server_data['frame_range'][0]}-{server_data['frame_range'][-1]}]"
subjob['parent'] = f"{parent_worker.id}@{parent_worker.hostname}"
def __create_subjob(job_data, local_hostname, project_path, server_data, server_hostname, worker):
subjob = job_data.copy()
subjob['name'] = f"{worker.name}[{server_data['frame_range'][0]}-{server_data['frame_range'][-1]}]"
subjob['parent'] = f"{worker.id}@{local_hostname}"
subjob['start_frame'] = server_data['frame_range'][0]
subjob['end_frame'] = server_data['frame_range'][-1]
subjob['engine_version'] = parent_worker.engine_version
logger.debug(f"Posting subjob with frames {subjob['start_frame']}-"
f"{subjob['end_frame']} to {server_hostname}")
post_results = RenderServerProxy(server_hostname).post_job_to_server(
file_path=project_path, job_data=subjob)
file_path=project_path, job_list=[subjob])
return post_results
# --------------------------------------------
# Server Handling
# --------------------------------------------
@staticmethod
def distribute_server_work(start_frame, end_frame, available_servers, method='cpu_count'):
"""
Splits the frame range among available servers proportionally based on their performance (CPU count).
:param start_frame: int, The start frame number of the animation to be rendered.
:param end_frame: int, The end frame number of the animation to be rendered.
:param available_servers: list, A list of available server dictionaries. Each server dictionary should include
'hostname' and 'cpu_count' keys (see find_available_servers)
:param method: str, Optional. Specifies the distribution method. Possible values are 'cpu_count' and 'equally'
:return: A list of server dictionaries where each dictionary includes the frame range and total number of frames
to be rendered by the server.
"""
# Calculate respective frames for each server
def divide_frames_by_cpu_count(frame_start, frame_end, servers):
total_frames = frame_end - frame_start + 1
total_performance = sum(server['cpu_count'] for server in servers)
frame_ranges = {}
current_frame = frame_start
allocated_frames = 0
for i, server in enumerate(servers):
if i == len(servers) - 1: # if it's the last server
# Give all remaining frames to the last server
num_frames = total_frames - allocated_frames
else:
num_frames = round((server['cpu_count'] / total_performance) * total_frames)
allocated_frames += num_frames
frame_end_for_server = current_frame + num_frames - 1
if current_frame <= frame_end_for_server:
frame_ranges[server['hostname']] = (current_frame, frame_end_for_server)
current_frame = frame_end_for_server + 1
return frame_ranges
def divide_frames_equally(frame_start, frame_end, servers):
frame_range = frame_end - frame_start + 1
frames_per_server = frame_range // len(servers)
leftover_frames = frame_range % len(servers)
frame_ranges = {}
current_start = frame_start
for i, server in enumerate(servers):
current_end = current_start + frames_per_server - 1
if leftover_frames > 0:
current_end += 1
leftover_frames -= 1
if current_start <= current_end:
frame_ranges[server['hostname']] = (current_start, current_end)
current_start = current_end + 1
return frame_ranges
if method == 'equally':
breakdown = divide_frames_equally(start_frame, end_frame, available_servers)
# elif method == 'benchmark_score': # todo: implement benchmark score
# pass
else:
breakdown = divide_frames_by_cpu_count(start_frame, end_frame, available_servers)
server_breakdown = [server for server in available_servers if breakdown.get(server['hostname']) is not None]
for server in server_breakdown:
server['frame_range'] = breakdown[server['hostname']]
server['total_frames'] = breakdown[server['hostname']][-1] - breakdown[server['hostname']][0] + 1
return server_breakdown
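The CPU-weighted split above can be condensed into a standalone sketch; it keeps the same behavior — proportional rounding per server, with the last server absorbing any rounding leftovers so every frame is allocated exactly once (hostnames and counts below are made up):

```python
def split_frames_by_cpu(start, end, servers):
    # servers: list of (hostname, cpu_count) tuples
    total = end - start + 1
    total_cpu = sum(cpus for _, cpus in servers)
    ranges, cur, allocated = {}, start, 0
    for i, (host, cpus) in enumerate(servers):
        if i == len(servers) - 1:
            # last server takes whatever rounding left over
            n = total - allocated
        else:
            n = round((cpus / total_cpu) * total)
        allocated += n
        last = cur + n - 1
        if cur <= last:
            ranges[host] = (cur, last)
        cur = last + 1
    return ranges

print(split_frames_by_cpu(1, 100, [('fast', 8), ('slow', 4), ('slow2', 4)]))
# → {'fast': (1, 50), 'slow': (51, 75), 'slow2': (76, 100)}
```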
@staticmethod
def find_available_servers(engine_name: str, system_os=None):
def find_available_servers(engine_name, system_os=None):
"""
Scans the Zeroconf network for currently available render servers supporting a specific engine.
@@ -371,28 +361,12 @@ class DistributedJobManager:
:param system_os: str, Restrict results to servers running a specific OS
:return: A list of dictionaries with each dict containing hostname and cpu_count of available servers
"""
from api.api_server import API_VERSION
found_available_servers = []
available_servers = []
for hostname in ZeroconfServer.found_hostnames():
host_properties = ZeroconfServer.get_hostname_properties(hostname)
if host_properties.get('api_version') == API_VERSION:
if not system_os or (system_os and system_os == host_properties.get('system_os')):
response = RenderServerProxy(hostname).is_engine_available(engine_name)
if response and response.get('available', False):
found_available_servers.append(response)
if not system_os or (system_os and system_os == host_properties.get('system_os')):
response = RenderServerProxy(hostname).is_engine_available(engine_name)
if response and response.get('available', False):
available_servers.append(response)
return found_available_servers
if __name__ == '__main__':
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
ZeroconfServer.configure("_zordon._tcp.local.", 'testing', 8080)
ZeroconfServer.start(listen_only=True)
print("Starting Zeroconf...")
time.sleep(2)
available_servers = DistributedJobManager.find_available_servers('blender')
print(f"AVAILABLE SERVERS ({len(available_servers)}): {available_servers}")
# results = distribute_server_work(1, 100, available_servers)
# print(f"RESULTS: {results}")
ZeroconfServer.stop()
return available_servers


@@ -8,7 +8,7 @@ class AERender(BaseRenderEngine):
def version(self):
version = None
try:
render_path = self.engine_path()
render_path = self.renderer_path()
if render_path:
ver_out = subprocess.check_output([render_path, '-version'], timeout=SUBPROCESS_TIMEOUT)
version = ver_out.decode('utf-8').split(" ")[-1].strip()


@@ -1,6 +1,5 @@
import logging
import re
import threading
import requests
@@ -8,7 +7,8 @@ from src.engines.blender.blender_engine import Blender
from src.engines.core.base_downloader import EngineDownloader
from src.utilities.misc_helper import current_system_os, current_system_cpu
url = "https://download.blender.org/release/"
# url = "https://download.blender.org/release/"
url = "https://ftp.nluug.nl/pub/graphics/blender/release/" # much faster mirror for testing
logger = logging.getLogger()
supported_formats = ['.zip', '.tar.xz', '.dmg']
@@ -43,13 +43,11 @@ class BlenderDownloader(EngineDownloader):
response = requests.get(base_url, timeout=5)
response.raise_for_status()
versions_pattern = \
r'<a href="(?P<file>[^"]+)">blender-(?P<version>[\d\.]+)-(?P<system_os>\w+)-(?P<cpu>\w+).*</a>'
versions_pattern = r'<a href="(?P<file>[^"]+)">blender-(?P<version>[\d\.]+)-(?P<system_os>\w+)-(?P<cpu>\w+).*</a>'
versions_data = [match.groupdict() for match in re.finditer(versions_pattern, response.text)]
# Filter to just the supported formats
versions_data = [item for item in versions_data if any(item["file"].endswith(ext) for ext in
supported_formats)]
versions_data = [item for item in versions_data if any(item["file"].endswith(ext) for ext in supported_formats)]
# Filter down OS and CPU
system_os = system_os or current_system_os()
@@ -80,31 +78,6 @@ class BlenderDownloader(EngineDownloader):
return lts_versions
@classmethod
def all_versions(cls, system_os=None, cpu=None):
majors = cls.__get_major_versions()
all_versions = []
threads = []
results = [[] for _ in majors]
def thread_function(major_version, index, system_os_t, cpu_t):
results[index] = cls.__get_minor_versions(major_version, system_os_t, cpu_t)
for i, m in enumerate(majors):
thread = threading.Thread(target=thread_function, args=(m, i, system_os, cpu))
threads.append(thread)
thread.start()
# Wait for all threads to complete
for thread in threads:
thread.join()
# Extend all_versions with the results from each thread
for result in results:
all_versions.extend(result)
return all_versions
@classmethod
def find_most_recent_version(cls, system_os=None, cpu=None, lts_only=False):
try:
@@ -125,17 +98,18 @@ class BlenderDownloader(EngineDownloader):
return None
@classmethod
def download_engine(cls, version, download_location, system_os=None, cpu=None, timeout=120, progress_callback=None):
def download_engine(cls, version, download_location, system_os=None, cpu=None, timeout=120):
system_os = system_os or current_system_os()
cpu = cpu or current_system_cpu()
try:
logger.info(f"Requesting download of blender-{version}-{system_os}-{cpu}")
major_version = '.'.join(version.split('.')[:2])
minor_versions = [x for x in cls.__get_minor_versions(major_version, system_os, cpu) if
x['version'] == version]
minor_versions = [x for x in cls.__get_minor_versions(major_version, system_os, cpu) if x['version'] == version]
# we get the URL instead of calculating it ourselves. May change this
cls.download_and_extract_app(remote_url=minor_versions[0]['url'], download_location=download_location,
timeout=timeout, progress_callback=progress_callback)
timeout=timeout)
except IndexError:
logger.error("Cannot find requested engine")
@@ -143,4 +117,5 @@ class BlenderDownloader(EngineDownloader):
if __name__ == '__main__':
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
print(BlenderDownloader.find_most_recent_version())
print(BlenderDownloader.__get_major_versions())
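The version-scrape regex in `BlenderDownloader` can be exercised against a hand-written snippet of the release index (the HTML below is fabricated for illustration; the pattern is the one from the patch):

```python
import re

versions_pattern = (r'<a href="(?P<file>[^"]+)">blender-(?P<version>[\d\.]+)-'
                    r'(?P<system_os>\w+)-(?P<cpu>\w+).*</a>')

# Hand-written stand-in for one row of the release index page
html = ('<a href="blender-4.0.2-linux-x64.tar.xz">'
        'blender-4.0.2-linux-x64.tar.xz</a>')

# Named groups give the same dicts the downloader filters on
matches = [m.groupdict() for m in re.finditer(versions_pattern, html)]
print(matches[0]['version'], matches[0]['system_os'], matches[0]['cpu'])
# → 4.0.2 linux x64
```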


@@ -1,13 +1,11 @@
import json
import re
from pathlib import Path
from src.engines.core.base_engine import *
from src.utilities.misc_helper import system_safe_path
logger = logging.getLogger()
_creationflags = subprocess.CREATE_NO_WINDOW if platform.system() == 'Windows' else 0
class Blender(BaseRenderEngine):
@@ -24,23 +22,16 @@ class Blender(BaseRenderEngine):
from src.engines.blender.blender_worker import BlenderRenderWorker
return BlenderRenderWorker
def ui_options(self):
options = [
{'name': 'engine', 'options': self.supported_render_engines()},
{'name': 'render_device', 'options': ['Any', 'GPU', 'CPU']},
]
return options
def supported_extensions(self):
@staticmethod
def supported_extensions():
return ['blend']
def version(self):
version = None
try:
render_path = self.engine_path()
render_path = self.renderer_path()
if render_path:
ver_out = subprocess.check_output([render_path, '-v'], timeout=SUBPROCESS_TIMEOUT,
creationflags=_creationflags)
ver_out = subprocess.check_output([render_path, '-v'], timeout=SUBPROCESS_TIMEOUT)
version = ver_out.decode('utf-8').splitlines()[0].replace('Blender', '').strip()
except Exception as e:
logger.error(f'Failed to get Blender version: {e}')
@@ -54,43 +45,32 @@ class Blender(BaseRenderEngine):
def run_python_expression(self, project_path, python_expression, timeout=None):
if os.path.exists(project_path):
try:
return subprocess.run([self.engine_path(), '-b', project_path, '--python-expr', python_expression],
capture_output=True, timeout=timeout, creationflags=_creationflags)
return subprocess.run([self.renderer_path(), '-b', project_path, '--python-expr', python_expression],
capture_output=True, timeout=timeout)
except Exception as e:
err_msg = f"Error running python expression in blender: {e}"
logger.error(err_msg)
raise ChildProcessError(err_msg)
logger.error(f"Error running python expression in blender: {e}")
else:
raise FileNotFoundError(f'Project file not found: {project_path}')
def run_python_script(self, script_path, project_path=None, timeout=None):
if project_path and not os.path.exists(project_path):
def run_python_script(self, project_path, script_path, timeout=None):
if os.path.exists(project_path) and os.path.exists(script_path):
try:
return subprocess.run([self.renderer_path(), '-b', project_path, '--python', script_path],
capture_output=True, timeout=timeout)
except Exception as e:
logger.warning(f"Error running python script in blender: {e}")
pass
elif not os.path.exists(project_path):
raise FileNotFoundError(f'Project file not found: {project_path}')
elif not os.path.exists(script_path):
raise FileNotFoundError(f'Python script not found: {script_path}')
try:
command = [self.engine_path(), '-b', '--python', script_path]
if project_path:
command.insert(2, project_path)
result = subprocess.run(command, capture_output=True, timeout=timeout, creationflags=_creationflags)
return result
except subprocess.TimeoutExpired:
err_msg = f"Timed out after {timeout}s while running python script in blender: {script_path}"
logger.error(err_msg)
raise TimeoutError(err_msg)
except Exception as e:
err_msg = f"Error running python script in blender: {e}"
logger.error(err_msg)
raise ChildProcessError(err_msg)
raise Exception("Uncaught exception")
def get_project_info(self, project_path, timeout=10):
scene_info = {}
try:
script_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'scripts', 'get_file_info.py')
results = self.run_python_script(project_path=project_path, script_path=Path(script_path),
timeout=timeout)
results = self.run_python_script(project_path, system_safe_path(script_path), timeout=timeout)
result_text = results.stdout.decode()
for line in result_text.splitlines():
if line.startswith('SCENE_DATA:'):
@@ -100,18 +80,15 @@ class Blender(BaseRenderEngine):
elif line.startswith('Error'):
logger.error(f"get_scene_info error: {line.strip()}")
except Exception as e:
msg = f'Error getting file details for .blend file: {e}'
logger.error(msg)
raise ChildProcessError(msg)
logger.error(f'Error getting file details for .blend file: {e}')
return scene_info
def pack_project_file(self, project_path, timeout=None):
def pack_project_file(self, project_path, timeout=30):
# Credit to L0Lock for pack script - https://blender.stackexchange.com/a/243935
try:
logger.info(f"Starting to pack Blender file: {project_path}")
script_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'scripts', 'pack_project.py')
results = self.run_python_script(project_path=project_path, script_path=Path(script_path),
timeout=timeout)
results = self.run_python_script(project_path, system_safe_path(script_path), timeout=timeout)
result_text = results.stdout.decode()
dir_name = os.path.dirname(project_path)
@@ -119,7 +96,7 @@ class Blender(BaseRenderEngine):
# report any missing textures
not_found = re.findall("(Unable to pack file, source path .*)\n", result_text)
for err in not_found:
raise ChildProcessError(err)
logger.error(err)
p = re.compile('Saved to: (.*)\n')
match = p.search(result_text)
@@ -127,15 +104,12 @@ class Blender(BaseRenderEngine):
new_path = os.path.join(dir_name, match.group(1).strip())
logger.info(f'Blender file packed successfully to {new_path}')
return new_path
return project_path
except Exception as e:
msg = f'Error packing .blend file: {e}'
logger.error(msg)
raise ChildProcessError(msg)
logger.error(f'Error packing .blend file: {e}')
return None
def get_arguments(self):
help_text = subprocess.check_output([self.engine_path(), '-h'], creationflags=_creationflags).decode('utf-8')
help_text = subprocess.check_output([self.renderer_path(), '-h']).decode('utf-8')
lines = help_text.splitlines()
options = {}
@@ -166,32 +140,31 @@ class Blender(BaseRenderEngine):
return options
def system_info(self):
return {'render_devices': self.get_render_devices(), 'engines': self.supported_render_engines()}
def get_render_devices(self):
script_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'scripts', 'get_system_info.py')
results = self.run_python_script(script_path=script_path)
output = results.stdout.decode()
match = re.search(r"GPU DATA:(\[[\s\S]*\])", output)
if match:
gpu_data_json = match.group(1)
gpus_info = json.loads(gpu_data_json)
return gpus_info
else:
logger.error("GPU data not found in the output.")
def get_detected_gpus(self):
# no longer works on 4.0
engine_output = subprocess.run([self.renderer_path(), '-E', 'help'], timeout=SUBPROCESS_TIMEOUT,
capture_output=True).stdout.decode('utf-8')
gpu_names = re.findall(r"DETECTED GPU: (.+)", engine_output)
return gpu_names
def supported_render_engines(self):
engine_output = subprocess.run([self.engine_path(), '-b', '-E', 'help'], timeout=SUBPROCESS_TIMEOUT,
capture_output=True, creationflags=_creationflags).stdout.decode('utf-8').strip()
engine_output = subprocess.run([self.renderer_path(), '-E', 'help'], timeout=SUBPROCESS_TIMEOUT,
capture_output=True).stdout.decode('utf-8').strip()
render_engines = [x.strip() for x in engine_output.split('Blender Engine Listing:')[-1].strip().splitlines()]
return render_engines
# UI and setup
def get_options(self):
options = [
{'name': 'engine', 'options': self.supported_render_engines()},
]
return options
def perform_presubmission_tasks(self, project_path):
packed_path = self.pack_project_file(project_path, timeout=120)
packed_path = self.pack_project_file(project_path, timeout=30)
return packed_path
if __name__ == "__main__":
x = Blender().get_render_devices()
x = Blender.get_detected_gpus()
print(x)


@@ -12,71 +12,41 @@ class BlenderRenderWorker(BaseRenderWorker):
engine = Blender
def __init__(self, input_path, output_path, engine_path, args=None, parent=None, name=None):
super(BlenderRenderWorker, self).__init__(input_path=input_path, output_path=output_path, engine_path=engine_path, args=args, parent=parent, name=name)
super(BlenderRenderWorker, self).__init__(input_path=input_path, output_path=output_path,
engine_path=engine_path, args=args, parent=parent, name=name)
# Args
self.blender_engine = self.args.get('engine', 'BLENDER_EEVEE').upper()
self.export_format = self.args.get('export_format', None) or 'JPEG'
self.camera = self.args.get('camera', None)
# Stats
self.__frame_percent_complete = 0.0
# Scene Info
self.scene_info = {}
self.scene_info = Blender(engine_path).get_project_info(input_path)
self.start_frame = int(self.scene_info.get('start_frame', 1))
self.end_frame = int(self.scene_info.get('end_frame', self.start_frame))
self.project_length = (self.end_frame - self.start_frame) + 1
self.current_frame = -1
def generate_worker_subprocess(self):
self.scene_info = Blender(self.engine_path).get_project_info(self.input_path)
cmd = [self.engine_path]
cmd = [self.renderer_path]
if self.args.get('background', True): # optionally run render not in background
cmd.append('-b')
cmd.append(self.input_path)
# Set Render Engine
blender_engine = self.args.get('engine')
if blender_engine:
blender_engine = blender_engine.upper()
cmd.extend(['-E', blender_engine])
# Start Python expressions - # todo: investigate splitting into separate 'setup' script
# Python expressions
cmd.append('--python-expr')
python_exp = 'import bpy; bpy.context.scene.render.use_overwrite = False;'
# Setup Custom Resolution
if self.args.get('resolution'):
res = self.args.get('resolution')
python_exp += 'bpy.context.scene.render.resolution_percentage = 100;'
python_exp += f'bpy.context.scene.render.resolution_x={res[0]}; bpy.context.scene.render.resolution_y={res[1]};'
# Setup Custom Camera
custom_camera = self.args.get('camera', None)
if custom_camera:
python_exp += f"bpy.context.scene.camera = bpy.data.objects['{custom_camera}'];"
# Set Render Device for Cycles (gpu/cpu/any)
if blender_engine == 'CYCLES':
render_device = self.args.get('render_device', 'any').lower()
if render_device not in {'any', 'gpu', 'cpu'}:
raise AttributeError(f"Invalid Cycles render device: {render_device}")
use_gpu = render_device in {'any', 'gpu'}
use_cpu = render_device in {'any', 'cpu'}
python_exp = python_exp + ("exec(\"for device in bpy.context.preferences.addons["
f"'cycles'].preferences.devices: device.use = {use_cpu} if device.type == 'CPU'"
f" else {use_gpu}\")")
# -- insert any other python exp checks / generators here --
# End Python expressions here
if self.camera:
python_exp = python_exp + f"bpy.context.scene.camera = bpy.data.objects['{self.camera}'];"
# insert any other python exp checks here
cmd.append(python_exp)
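The Cycles device selection above is injected as a single `--python-expr` string, which is why it is wrapped in `exec()`. A sketch of just the string construction, mirroring the snippet (the `bpy` preferences path inside the string is taken from the diff, not verified here):

```python
def cycles_device_expr(render_device='any'):
    # Build the --python-expr fragment that toggles Cycles device.use flags
    render_device = render_device.lower()
    if render_device not in {'any', 'gpu', 'cpu'}:
        raise AttributeError(f"Invalid Cycles render device: {render_device}")
    use_gpu = render_device in {'any', 'gpu'}
    use_cpu = render_device in {'any', 'cpu'}
    # exec() is needed because --python-expr accepts one expression, not a for-loop
    return ("exec(\"for device in bpy.context.preferences.addons["
            f"'cycles'].preferences.devices: device.use = {use_cpu} if device.type == 'CPU'"
            f" else {use_gpu}\")")
```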
# Export format
export_format = self.args.get('export_format', None) or 'JPEG'
main_part, ext = os.path.splitext(self.output_path)
# Remove the extension only if it is not composed entirely of digits
path_without_ext = main_part if not ext[1:].isdigit() else self.output_path
path_without_ext += "_"
cmd.extend(['-o', path_without_ext, '-F', export_format])
path_without_ext = os.path.splitext(self.output_path)[0] + "_"
cmd.extend(['-E', self.blender_engine, '-o', path_without_ext, '-F', self.export_format])
# set frame range
cmd.extend(['-s', self.start_frame, '-e', self.end_frame, '-a'])
@@ -90,15 +60,11 @@ class BlenderRenderWorker(BaseRenderWorker):
def _parse_stdout(self, line):
cycles_pattern = re.compile(
pattern = re.compile(
r'Fra:(?P<frame>\d*).*Mem:(?P<memory>\S+).*Time:(?P<time>\S+)(?:.*Remaining:)?(?P<remaining>\S*)')
cycles_match = cycles_pattern.search(line)
eevee_pattern = re.compile(
r"Rendering\s+(?P<current>\d+)\s*/\s*(?P<total>\d+)\s+samples"
)
eevee_match = eevee_pattern.search(line)
if cycles_match:
stats = cycles_match.groupdict()
found = pattern.search(line)
if found:
stats = found.groupdict()
memory_use = stats['memory']
time_elapsed = stats['time']
time_remaining = stats['remaining'] or 'Unknown'
@@ -113,40 +79,27 @@ class BlenderRenderWorker(BaseRenderWorker):
logger.debug(
'Frame:{0} | Mem:{1} | Time:{2} | Remaining:{3}'.format(self.current_frame, memory_use,
time_elapsed, time_remaining))
elif eevee_match:
self.__frame_percent_complete = int(eevee_match.groups()[0]) / int(eevee_match.groups()[1])
logger.debug(f'Frame:{self.current_frame} | Samples:{eevee_match.groups()[0]}/{eevee_match.groups()[1]}')
elif "Rendering frame" in line: # used for eevee
self.current_frame = int(line.split("Rendering frame")[-1].strip())
elif "file doesn't exist" in line.lower():
self.log_error(line, halt_render=True)
elif line.lower().startswith('error'):
self.log_error(line)
elif 'Saved' in line or 'Saving' in line or 'quit' in line:
render_stats_match = re.match(r'Time: (.*) \(Saving', line)
output_filename_match = re.match(r"Saved: .*_(\d+)\.\w+", line) # try to get frame # from filename
if output_filename_match:
output_file_number = output_filename_match.groups()[0]
try:
self.current_frame = int(output_file_number)
self._send_frame_complete_notification()
except ValueError:
pass
elif render_stats_match:
time_completed = render_stats_match.groups()[0]
match = re.match(r'Time: (.*) \(Saving', line)
if match:
time_completed = match.groups()[0]
frame_count = self.current_frame - self.end_frame + self.total_frames
logger.info(f'Frame #{self.current_frame} - '
f'{frame_count} of {self.total_frames} completed in {time_completed} | '
f'Total Elapsed Time: {datetime.now() - self.start_time}')
else:
logger.debug(line)
else:
pass
# if len(line.strip()):
# logger.debug(line.strip())
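The Cycles progress pattern above can be exercised in isolation on a representative line (the sample line is illustrative, not captured Blender output):

```python
import re

# Same pattern as in _parse_stdout above
cycles_pattern = re.compile(
    r'Fra:(?P<frame>\d*).*Mem:(?P<memory>\S+).*Time:(?P<time>\S+)(?:.*Remaining:)?(?P<remaining>\S*)')

line = "Fra:12 Mem:256.00M | Time:00:01.50 | Remaining:00:10.00"
stats = cycles_pattern.search(line).groupdict()
# stats now holds frame number, memory use, elapsed time, and remaining time as strings
```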
def percent_complete(self):
if self.status == RenderStatus.COMPLETED:
return 1
elif self.total_frames <= 1:
if self.total_frames <= 1:
return self.__frame_percent_complete
else:
whole_frame_percent = (self.current_frame - self.start_frame) / self.total_frames
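The progress arithmetic can be checked in isolation. This is a sketch only: the hunk is truncated, so adding the active frame's fractional progress on top of the whole-frame term is an assumption about how the branch continues:

```python
def overall_percent(current_frame, start_frame, total_frames, frame_fraction=0.0):
    # frame_fraction: progress (0..1) within the frame currently rendering
    if total_frames <= 1:
        return frame_fraction
    whole = (current_frame - start_frame) / total_frames  # fully finished frames
    return whole + frame_fraction / total_frames          # plus the active frame's share
```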


@@ -1,21 +0,0 @@
import bpy
import json
# Force CPU rendering
bpy.context.preferences.addons["cycles"].preferences.compute_device_type = "NONE"
bpy.context.scene.cycles.device = "CPU"
# Ensure Cycles is available
bpy.context.preferences.addons['cycles'].preferences.get_devices()
# Collect the devices information
devices_info = []
for device in bpy.context.preferences.addons['cycles'].preferences.devices:
devices_info.append({
"name": device.name,
"type": device.type,
"use": device.use
})
# Print the devices information in JSON format
print("GPU DATA:" + json.dumps(devices_info))


@@ -1,7 +1,9 @@
import logging
import os
import shutil
import tarfile
import tempfile
import zipfile
import requests
from tqdm import tqdm
@@ -10,154 +12,26 @@ logger = logging.getLogger()
class EngineDownloader:
"""A class responsible for downloading and extracting rendering engines from publicly available URLs.
Attributes:
supported_formats (list[str]): A list of file formats supported by the downloader.
"""
supported_formats = ['.zip', '.tar.xz', '.dmg']
def __init__(self):
pass
# --------------------------------------------
# Required Overrides for Subclasses:
# --------------------------------------------
@classmethod
def find_most_recent_version(cls, system_os=None, cpu=None, lts_only=False):
"""
Finds the most recent version of the rendering engine available for download.
This method should be overridden in a subclass to implement the logic for determining
the most recent version of the rendering engine, optionally filtering by long-term
support (LTS) versions, the operating system, and CPU architecture.
Args:
system_os (str, optional): Desired OS ('linux', 'macos', 'windows'). Defaults to system os.
cpu (str, optional): The CPU architecture for which to download the engine. Default is system cpu.
lts_only (bool, optional): Limit the search to LTS (long-term support) versions only. Default is False.
Returns:
dict: A dict with the following keys:
- 'cpu' (str): The CPU architecture.
- 'system_os' (str): The operating system.
- 'file' (str): The filename of the version's download file.
- 'url' (str): The remote URL for downloading the version.
- 'version' (str): The version number.
Raises:
NotImplementedError: If the method is not overridden in a subclass.
"""
raise NotImplementedError(f"find_most_recent_version not implemented for {cls.__class__.__name__}")
raise NotImplementedError # implement this method in your engine subclass
@classmethod
def version_is_available_to_download(cls, version, system_os=None, cpu=None):
"""Checks if a requested version of the rendering engine is available for download.
This method should be overridden in a subclass to implement the logic for determining
whether a given version of the rendering engine is available for download, based on the
operating system and CPU architecture.
Args:
version (str): The requested renderer version to download.
system_os (str, optional): Desired OS ('linux', 'macos', 'windows'). Defaults to system os.
cpu (str, optional): The CPU architecture for which to download the engine. Default is system cpu.
Returns:
bool: True if the version is available for download, False otherwise.
Raises:
NotImplementedError: If the method is not overridden in a subclass.
"""
raise NotImplementedError(f"version_is_available_to_download not implemented for {cls.__class__.__name__}")
raise NotImplementedError # implement this method in your engine subclass
@classmethod
def download_engine(cls, version, download_location, system_os=None, cpu=None, timeout=120, progress_callback=None):
"""Downloads the requested version of the rendering engine to the given download location.
This method should be overridden in a subclass to implement the logic for downloading
a specific version of the rendering engine. The method is intended to handle the
downloading process based on the version, operating system, CPU architecture, and
timeout parameters.
Args:
version (str): The requested renderer version to download.
download_location (str): The directory where the engine should be downloaded.
system_os (str, optional): Desired OS ('linux', 'macos', 'windows'). Defaults to system os.
cpu (str, optional): The CPU architecture for which to download the engine. Default is system cpu.
timeout (int, optional): The maximum time in seconds to wait for the download. Default is 120 seconds.
progress_callback (callable, optional): A callback function that is called periodically with the current download progress.
Raises:
NotImplementedError: If the method is not overridden in a subclass.
"""
raise NotImplementedError(f"download_engine not implemented for {cls.__class__.__name__}")
# --------------------------------------------
# Optional Overrides for Subclasses:
# --------------------------------------------
def download_engine(cls, version, download_location, system_os=None, cpu=None, timeout=120):
raise NotImplementedError # implement this method in your engine subclass
@classmethod
def all_versions(cls, system_os=None, cpu=None):
"""Retrieves a list of available versions of the software for a specific operating system and CPU architecture.
This method fetches all available versions for the given operating system and CPU type, constructing
a list of dictionaries containing details such as the version, CPU architecture, system OS, and the
remote URL for downloading each version.
Args:
system_os (str, optional): Desired OS ('linux', 'macos', 'windows'). Defaults to system os.
cpu (str, optional): The CPU architecture for which to download the engine. Default is system cpu.
Returns:
list[dict]: A list of dictionaries, each containing:
- 'cpu' (str): The CPU architecture.
- 'file' (str): The filename of the version's download file.
- 'system_os' (str): The operating system.
- 'url' (str): The remote URL for downloading the version.
- 'version' (str): The version number.
"""
return []
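Given the dict shape documented above, a caller can filter for an exact version the same way the Blender `download_engine` override does (illustrative data with a placeholder URL, not real release metadata):

```python
# Hypothetical return value of all_versions(), shaped per the docstring above
versions = [
    {'cpu': 'x64', 'file': 'blender-4.0.1-linux-x64.tar.xz', 'system_os': 'linux',
     'url': 'https://example.com/blender-4.0.1-linux-x64.tar.xz', 'version': '4.0.1'},
    {'cpu': 'x64', 'file': 'blender-4.0.2-linux-x64.tar.xz', 'system_os': 'linux',
     'url': 'https://example.com/blender-4.0.2-linux-x64.tar.xz', 'version': '4.0.2'},
]

# Select the single entry matching the requested version
matches = [v for v in versions if v['version'] == '4.0.2']
```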
# --------------------------------------------
# Do Not Override These Methods:
# --------------------------------------------
@classmethod
def download_and_extract_app(cls, remote_url, download_location, timeout=120, progress_callback=None):
"""Downloads an application from the given remote URL and extracts it to the specified location.
This method handles the downloading of the application, supports multiple archive formats,
and extracts the contents to the specified `download_location`. It also manages temporary
files and logs progress throughout the process.
Args:
remote_url (str): The URL of the application to download.
download_location (str): The directory where the application should be extracted.
timeout (int, optional): The maximum time in seconds to wait for the download. Default is 120 seconds.
progress_callback (callable, optional): A callback function that is called periodically with the current download progress.
Returns:
str: The path to the directory where the application was extracted.
Raises:
Exception: Catches and logs any exceptions that occur during the download or extraction process.
Supported Formats:
- `.tar.xz`: Extracted using the `tarfile` module.
- `.zip`: Extracted using the `zipfile` module.
- `.dmg`: macOS disk image files, handled using the `dmglib` library.
- Other formats will result in an error being logged.
Notes:
- If the application already exists in the `download_location`, the method will log an error
and return without downloading or extracting.
- Temporary files created during the download process are cleaned up after completion.
"""
if progress_callback:
progress_callback(0)
def download_and_extract_app(cls, remote_url, download_location, timeout=120):
# Create a temp download directory
temp_download_dir = tempfile.mkdtemp()
@@ -170,7 +44,7 @@ class EngineDownloader:
if os.path.exists(os.path.join(download_location, output_dir_name)):
logger.error(f"Engine download for {output_dir_name} already exists")
return None
return
if not os.path.exists(temp_downloaded_file_path):
# Make a GET request to the URL with stream=True to enable streaming
@@ -186,33 +60,26 @@ class EngineDownloader:
progress_bar = tqdm(total=file_size, unit="B", unit_scale=True)
# Open a file for writing in binary mode
total_saved = 0
with open(temp_downloaded_file_path, "wb") as file:
for chunk in response.iter_content(chunk_size=1024):
if chunk:
# Write the chunk to the file
file.write(chunk)
total_saved += len(chunk)
# Update the progress bar
progress_bar.update(len(chunk))
if progress_callback:
percent = float(total_saved) / float(file_size)
progress_callback(percent)
# Close the progress bar
progress_callback(1.0)
progress_bar.close()
logger.info(f"Successfully downloaded {os.path.basename(temp_downloaded_file_path)}")
else:
logger.error(f"Failed to download the file. Status code: {response.status_code}")
return None
return
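The chunked write-and-report loop above can be isolated from `requests` for testing. A sketch where `chunks` stands in for `response.iter_content(...)` and any file-like object replaces the on-disk file:

```python
def write_chunks(chunks, file_size, out, progress_callback=None):
    # Stream chunks to `out`, reporting fractional progress after each write
    total_saved = 0
    for chunk in chunks:
        if chunk:
            out.write(chunk)
            total_saved += len(chunk)
            if progress_callback:
                progress_callback(total_saved / file_size)
    if progress_callback:
        progress_callback(1.0)  # final tick, mirroring the snippet above
```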
os.makedirs(download_location, exist_ok=True)
# Extract the downloaded file
# Process .tar.xz files
if temp_downloaded_file_path.lower().endswith('.tar.xz'):
import tarfile
try:
with tarfile.open(temp_downloaded_file_path, 'r:xz') as tar:
tar.extractall(path=download_location)
@@ -226,13 +93,12 @@ class EngineDownloader:
# Process .zip files
elif temp_downloaded_file_path.lower().endswith('.zip'):
import zipfile
try:
with zipfile.ZipFile(temp_downloaded_file_path, 'r') as zip_ref:
zip_ref.extractall(download_location)
logger.info(
f'Successfully extracted {os.path.basename(temp_downloaded_file_path)} to {download_location}')
except zipfile.BadZipFile:
except zipfile.BadZipFile as e:
logger.error(f'Error: {temp_downloaded_file_path} is not a valid ZIP file.')
except FileNotFoundError:
logger.error(f'File not found: {temp_downloaded_file_path}')
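The suffix-based dispatch above (`.tar.xz` via `tarfile`, `.zip` via `zipfile`, `.dmg` via `dmglib`) can be sketched as a standalone helper for the two stdlib-handled formats:

```python
import tarfile
import zipfile

def extract_archive(path, destination):
    # Pick the extractor from the archive suffix, as download_and_extract_app does
    lowered = path.lower()
    if lowered.endswith('.tar.xz'):
        with tarfile.open(path, 'r:xz') as tar:
            tar.extractall(path=destination)
    elif lowered.endswith('.zip'):
        with zipfile.ZipFile(path, 'r') as zip_ref:
            zip_ref.extractall(destination)
    else:
        raise ValueError(f'Unsupported archive format: {path}')
```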
@@ -244,8 +110,7 @@ class EngineDownloader:
for mount_point in dmg.attach():
try:
copy_directory_contents(mount_point, os.path.join(download_location, output_dir_name))
logger.info(f'Successfully copied {os.path.basename(temp_downloaded_file_path)} '
f'to {download_location}')
logger.info(f'Successfully copied {os.path.basename(temp_downloaded_file_path)} to {download_location}')
except FileNotFoundError:
logger.error(f'Error: The source .app bundle does not exist.')
except PermissionError:


@@ -1,196 +1,31 @@
import logging
import os
import platform
import subprocess
from typing import Optional, List, Dict, Any, Type
logger = logging.getLogger()
SUBPROCESS_TIMEOUT = 10
SUBPROCESS_TIMEOUT = 5
class BaseRenderEngine(object):
"""Base class for render engines. This class provides common functionality and structure for various rendering
engines. Create subclasses and override the methods marked below to add additional engines
Attributes:
install_paths (list): A list of default installation paths where the render engine
might be found. This list can be populated with common paths to help locate the
executable on different operating systems or environments.
"""
install_paths = []
supported_extensions = []
install_paths: List[str] = []
binary_names: Dict[str, str] = {}
def __init__(self, custom_path=None):
self.custom_renderer_path = custom_path
if not self.renderer_path():
raise FileNotFoundError(f"Cannot find path to renderer for {self.name()} instance")
# --------------------------------------------
# Required Overrides for Subclasses:
# --------------------------------------------
def __init__(self, custom_path: Optional[str] = None) -> None:
"""Initialize the render engine.
Args:
custom_path: Optional custom path to the engine executable.
Raises:
FileNotFoundError: If the engine executable cannot be found.
"""
self.custom_engine_path: Optional[str] = custom_path
if not self.engine_path() or not os.path.exists(self.engine_path()):
raise FileNotFoundError(f"Cannot find path to engine for {self.name()} instance: {self.engine_path()}")
if not os.access(self.engine_path(), os.X_OK):
logger.warning(f"Path is not executable. Setting permissions to 755 for {self.engine_path()}")
os.chmod(self.engine_path(), 0o755)
def version(self) -> str:
"""Return the version number as a string.
Returns:
str: Version number.
Raises:
NotImplementedError: If not overridden.
"""
raise NotImplementedError(f"version not implemented for {self.__class__.__name__}")
def get_project_info(self, project_path: str, timeout: int = 10) -> Dict[str, Any]:
"""Extracts detailed project information from the given project path.
Args:
project_path (str): The path to the project file.
timeout (int, optional): The maximum time (in seconds) to wait for the operation. Default is 10 seconds.
Returns:
dict: A dictionary containing project information (subclasses should define the structure).
Raises:
NotImplementedError: If the method is not overridden in a subclass.
"""
raise NotImplementedError(f"get_project_info not implemented for {self.__class__.__name__}")
def renderer_path(self):
return self.custom_renderer_path or self.default_renderer_path()
@classmethod
def get_output_formats(cls) -> List[str]:
"""Returns a list of available output formats supported by the engine.
Returns:
list[str]: A list of strings representing the available output formats.
"""
raise NotImplementedError(f"get_output_formats not implemented for {cls.__name__}")
@staticmethod
def worker_class() -> Type[Any]: # override when subclassing to link worker class
raise NotImplementedError("Worker class not implemented")
# --------------------------------------------
# Optional Overrides for Subclasses:
# --------------------------------------------
def supported_extensions(self) -> List[str]:
"""Return a list of file extensions supported by this engine.
Returns:
List[str]: List of supported file extensions (e.g., ['.blend', '.mp4']).
"""
return []
def get_help(self) -> str:
"""Retrieves the help documentation for the engine.
This method runs the engine's help command (default: '-h') and captures the output.
Override this method if the engine uses a different help flag.
Returns:
str: The help documentation as a string.
Raises:
FileNotFoundError: If the engine path is not found.
"""
path = self.engine_path()
if not path:
raise FileNotFoundError(f"Engine path not found: {path}")
creationflags = subprocess.CREATE_NO_WINDOW if platform.system() == 'Windows' else 0
help_doc = subprocess.check_output([path, '-h'], stderr=subprocess.STDOUT,
timeout=SUBPROCESS_TIMEOUT, creationflags=creationflags).decode('utf-8')
return help_doc
def system_info(self) -> Dict[str, Any]:
"""Return additional information about the system specific to the engine (configured GPUs, render engines, etc.)
Returns:
dict: A dictionary with engine-specific system information
"""
return {}
def perform_presubmission_tasks(self, project_path: str) -> str:
"""Perform any pre-submission tasks on a project file before uploading it to a server (pack textures, etc.)
Override this method to:
1. Copy the project file to a temporary location (DO NOT MODIFY ORIGINAL PATH).
2. Perform additional modifications or tasks.
3. Return the path to the modified project file.
Args:
project_path (str): The original project file path.
Returns:
str: The path to the modified project file.
"""
return project_path
def get_arguments(self) -> Dict[str, Any]:
"""Return command-line arguments for this engine.
Returns:
Dict[str, Any]: Dictionary of argument specifications.
"""
return {}
@staticmethod
def downloader() -> Optional[Any]:
"""Return the downloader class for this engine.
Returns:
Optional[Any]: Downloader class instance, or None if no downloader is used.
"""
return None
def ui_options(self) -> Dict[str, Any]:
"""Return UI configuration options for this engine.
Returns:
Dict[str, Any]: Dictionary of UI options and their configurations.
"""
return {}
# --------------------------------------------
# Do Not Override These Methods:
# --------------------------------------------
def engine_path(self) -> Optional[str]:
"""Return the path to the engine executable.
Returns:
Optional[str]: Path to the engine executable, or None if not found.
"""
return self.custom_engine_path or self.default_engine_path()
@classmethod
def name(cls) -> str:
"""Return the name of this engine.
Returns:
str: Engine name in lowercase.
"""
def name(cls):
return str(cls.__name__).lower()
@classmethod
def default_engine_path(cls) -> Optional[str]:
"""Find the default path to the engine executable.
Returns:
Optional[str]: Default path to the engine executable, or None if not found.
"""
path: Optional[str] = None
def default_renderer_path(cls):
path = None
try: # Linux and macOS
path = subprocess.check_output(['which', cls.name()], timeout=SUBPROCESS_TIMEOUT).decode('utf-8').strip()
except (subprocess.CalledProcessError, FileNotFoundError):
@@ -200,3 +35,40 @@ class BaseRenderEngine(object):
except Exception as e:
logger.exception(e)
return path
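The `which` subprocess lookup above can also be done without spawning a process, using the standard library. A sketch (a design alternative, not what the diff implements; `shutil.which` handles Windows `PATHEXT` as well):

```python
import shutil

def locate_binary(name, fallback_paths=()):
    # Search PATH first, then any engine-specific candidate paths
    path = shutil.which(name)
    if path:
        return path
    for candidate in fallback_paths:  # mirrors the install_paths fallback idea
        found = shutil.which(candidate)
        if found:
            return found
    return None
```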
def version(self):
raise NotImplementedError("version not implemented")
@staticmethod
def downloader(): # override when subclassing if using a downloader class
return None
@staticmethod
def worker_class(): # override when subclassing to link worker class
raise NotImplementedError("Worker class not implemented")
def get_help(self): # override if renderer uses different help flag
path = self.renderer_path()
if not path:
raise FileNotFoundError("renderer path not found")
help_doc = subprocess.check_output([path, '-h'], stderr=subprocess.STDOUT,
timeout=SUBPROCESS_TIMEOUT).decode('utf-8')
return help_doc
def get_project_info(self, project_path, timeout=10):
raise NotImplementedError(f"get_project_info not implemented for {self.__name__}")
@classmethod
def get_output_formats(cls):
raise NotImplementedError(f"get_output_formats not implemented for {cls.__name__}")
@classmethod
def get_arguments(cls):
pass
def get_options(self): # override to return options for ui
return {}
def perform_presubmission_tasks(self, project_path):
return project_path


@@ -3,18 +3,14 @@ import io
import json
import logging
import os
import signal
import subprocess
import threading
import time
from datetime import datetime
from typing import Optional, Dict, Any, List, Union
import psutil
from pubsub import pub
from sqlalchemy import Column, Integer, String, DateTime, JSON
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.ext.mutable import MutableDict
from src.utilities.misc_helper import get_time_elapsed
from src.utilities.status_utils import RenderStatus, string_to_status
@@ -27,81 +23,60 @@ class BaseRenderWorker(Base):
__tablename__ = 'render_workers'
id = Column(String, primary_key=True)
hostname = Column(String, nullable=True)
input_path = Column(String)
output_path = Column(String)
date_created = Column(DateTime)
start_time = Column(DateTime, nullable=True)
end_time = Column(DateTime, nullable=True)
engine_name = Column(String)
engine_version = Column(String)
engine_path = Column(String)
renderer = Column(String)
renderer_version = Column(String)
renderer_path = Column(String)
priority = Column(Integer)
project_length = Column(Integer)
start_frame = Column(Integer)
end_frame = Column(Integer, nullable=True)
parent = Column(String, nullable=True)
children = Column(MutableDict.as_mutable(JSON))
args = Column(MutableDict.as_mutable(JSON))
children = Column(JSON)
name = Column(String)
file_hash = Column(String)
_status = Column(String)
# --------------------------------------------
# Required Overrides for Subclasses:
# --------------------------------------------
engine = None
def __init__(self, input_path: str, output_path: str, engine_path: str, priority: int = 2,
args: Optional[Dict[str, Any]] = None, ignore_extensions: bool = True,
parent: Optional[str] = None, name: Optional[str] = None) -> None:
"""Initialize a render worker.
def __init__(self, input_path, output_path, engine_path, priority=2, args=None, ignore_extensions=True, parent=None,
name=None):
Args:
input_path: Path to the input project file.
output_path: Path where output files will be saved.
engine_path: Path to the render engine executable.
priority: Job priority level (default: 2).
args: Additional arguments for the render job.
ignore_extensions: Whether to ignore file extension validation.
parent: Parent job ID for distributed jobs.
name: Custom name for the job.
Raises:
ValueError: If file extension is not supported.
NotImplementedError: If engine is not defined.
"""
if not ignore_extensions:
if not any(ext in input_path for ext in self.engine.supported_extensions()):
err_msg = f"Cannot find valid project with supported file extension for '{self.engine.name()}'"
err_msg = f'Cannot find valid project with supported file extension for {self.engine.name()} renderer'
logger.error(err_msg)
raise ValueError(err_msg)
if not self.engine:
raise NotImplementedError(f"Engine not defined for {self.__class__.__name__}")
raise NotImplementedError("Engine not defined")
def generate_id():
"""Generate a unique job ID."""
import uuid
return str(uuid.uuid4()).split('-')[0]
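The ID helper keeps only the first group of a UUID4, which is always eight lowercase hex characters; a standalone sketch:

```python
import uuid


def generate_id():
    """Generate a short unique job ID (first UUID4 group, 8 hex chars)."""
    return str(uuid.uuid4()).split('-')[0]


job_id = generate_id()
print(job_id)  # e.g. '9f1c2b3a'
```

Eight hex characters give ~4 billion possible IDs, which is plenty for a job queue but not collision-proof at very large scale.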
# Essential Info
self.id = generate_id()
self.hostname = None
self.input_path = input_path
self.output_path = output_path
self.args = args or {}
self.date_created = datetime.now()
self.engine_name = self.engine.name()
self.engine_path = engine_path
self.engine_version = self.engine(engine_path).version()
self.custom_engine_path = None
self.renderer = self.engine.name()
self.renderer_path = engine_path
self.renderer_version = self.engine(engine_path).version()
self.custom_renderer_path = None
self.priority = priority
self.parent = parent
self.children = {}
self.name = name or os.path.basename(input_path)
self.maximum_attempts = 3
# Frame Ranges
self.current_frame = 0
self.start_frame = 0
self.project_length = -1
self.current_frame = 0 # should this be a 1 ?
self.start_frame = 0 # should this be a 1 ?
self.end_frame = None
# Logging
@@ -114,73 +89,14 @@ class BaseRenderWorker(Base):
self.errors = []
# Threads and processes
self.__thread = threading.Thread(target=self.__run, args=())
self.__thread = threading.Thread(target=self.run, args=())
self.__thread.daemon = True
self.__process = None
self.last_output = None
self.__last_output_time = None
self.watchdog_timeout = 120
def generate_worker_subprocess(self) -> List[str]:
"""Generate and return the list of command line arguments necessary to perform the requested job
Returns:
list[str]: list of command line arguments
"""
raise NotImplementedError("generate_worker_subprocess not implemented")
def _parse_stdout(self, line: str) -> None:
"""Parses a line of standard output from the engine.
This method should be overridden in a subclass to implement the logic for processing
and interpreting a single line of output from the engine's standard output stream.
On frame completion, the subclass should:
1. Update value of self.current_frame
2. Call self._send_frame_complete_notification()
Args:
line (str): A line of text from the engine's standard output.
Raises:
NotImplementedError: If the method is not overridden in a subclass.
"""
raise NotImplementedError(f"_parse_stdout not implemented for {self.__class__.__name__}")
# --------------------------------------------
# Optional Overrides for Subclasses:
# --------------------------------------------
def percent_complete(self) -> float:
"""Return the completion percentage of this job.
Returns:
float: Completion percentage between 0.0 and 1.0.
"""
# todo: fix this
if self.status == RenderStatus.COMPLETED:
return 1.0
return 0
def post_processing(self) -> None:
"""Override to perform any engine-specific postprocessing"""
pass
# --------------------------------------------
# Do Not Override These Methods:
# --------------------------------------------
def __repr__(self):
"""Return string representation of the worker.
Returns:
str: String representation showing job details.
"""
return f"<Job id:{self.id} p{self.priority} {self.engine_name}-{self.engine_version} '{self.name}' status:{self.status.value}>"
@property
def total_frames(self):
return max(self.end_frame, 1) - self.start_frame + 1
return (self.end_frame or self.project_length) - self.start_frame + 1
@property
def status(self):
@@ -200,316 +116,158 @@ class BaseRenderWorker(Base):
self._status = RenderStatus.CANCELLED.value
return string_to_status(self._status)
def _send_frame_complete_notification(self):
pub.sendMessage('frame_complete', job_id=self.id, frame_number=self.current_frame)
def validate(self):
if not os.path.exists(self.input_path):
raise FileNotFoundError(f"Cannot find input path: {self.input_path}")
self.generate_subprocess()
def generate_subprocess(self):
"""Generate subprocess command arguments.
Returns:
List[str]: List of command line arguments for the subprocess.
Raises:
ValueError: If argument conflicts are detected.
"""
# Convert raw args from string if available and catch conflicts
generated_args = [str(x) for x in self.generate_worker_subprocess()]
generated_args_flags = [x for x in generated_args if x.startswith('-')]
if len(generated_args_flags) != len(set(generated_args_flags)):
msg = f"Cannot generate subprocess - Multiple arg conflicts detected: {generated_args}"
msg = "Cannot generate subprocess - Multiple arg conflicts detected"
logger.error(msg)
logger.debug(f"Generated args for subprocess: {generated_args}")
raise ValueError(msg)
return generated_args
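The conflict check above works because duplicate flags shrink a set but not a list. A standalone sketch that also reports which flags collided (the flag values here are arbitrary):

```python
def detect_flag_conflicts(args):
    """Return the duplicated flags (args starting with '-'), if any."""
    flags = [a for a in args if a.startswith('-')]
    seen, dupes = set(), []
    for flag in flags:
        if flag in seen:
            dupes.append(flag)
        seen.add(flag)
    return dupes


print(detect_flag_conflicts(['-i', 'in.mov', '-y', 'out.mp4']))         # []
print(detect_flag_conflicts(['-i', 'in.mov', '-i', 'b.mov', 'o.mp4']))  # ['-i']
```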
def get_raw_args(self):
"""Parse raw command line arguments from args dictionary.
Returns:
Optional[List[str]]: Parsed raw arguments, or None if no raw args.
"""
raw_args_string = self.args.get('raw', '')
raw_args_string = self.args.get('raw', None)
raw_args = None
if raw_args_string:
import shlex
raw_args = shlex.split(raw_args_string)
return raw_args
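`shlex.split` tokenizes the raw string the way a POSIX shell would, so quoted paths with spaces survive as single arguments; for example:

```python
import shlex

# Hypothetical raw-args string a user might enter in the UI
raw = '-threads 4 -o "/renders/My Shot/out.%04d.png"'
print(shlex.split(raw))
# ['-threads', '4', '-o', '/renders/My Shot/out.%04d.png']
```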
def log_path(self):
"""Generate the log file path for this job.
def generate_worker_subprocess(self):
raise NotImplementedError("generate_worker_subprocess not implemented")
Returns:
str: Path to the log file.
"""
def log_path(self):
filename = (self.name or os.path.basename(self.input_path)) + '_' + \
self.date_created.strftime("%Y.%m.%d_%H.%M.%S") + '.log'
return os.path.join(os.path.dirname(self.input_path), filename)
def start(self):
"""Start the render job.
Validates input paths and engine availability, then starts the render thread.
"""
if self.status not in [RenderStatus.SCHEDULED, RenderStatus.NOT_STARTED, RenderStatus.CONFIGURING]:
if self.status not in [RenderStatus.SCHEDULED, RenderStatus.NOT_STARTED]:
logger.error(f"Trying to start job with status: {self.status}")
return
if not os.path.exists(self.input_path):
self.status = RenderStatus.ERROR
msg = f'Cannot find input path: {self.input_path}'
msg = 'Cannot find input path: {}'.format(self.input_path)
logger.error(msg)
self.errors.append(msg)
return
if not os.path.exists(self.engine_path):
if not os.path.exists(self.renderer_path):
self.status = RenderStatus.ERROR
msg = f'Cannot find render engine path for {self.engine.name()}'
msg = 'Cannot find render engine path for {}'.format(self.engine.name())
logger.error(msg)
self.errors.append(msg)
return
self.status = RenderStatus.RUNNING if not self.children else RenderStatus.WAITING_FOR_SUBJOBS
self.status = RenderStatus.RUNNING
self.start_time = datetime.now()
logger.info(f'Starting {self.engine.name()} {self.renderer_version} Render for {self.input_path} | '
f'Frame Count: {self.total_frames}')
self.__thread.start()
# handle multiple attempts at running subprocess
def __run__subprocess_cycle(self, log_file):
subprocess_cmds = self.generate_subprocess()
initial_file_count = len(self.file_list())
failed_attempts = 0
log_file.write(f"Running command: {subprocess_cmds}\n")
log_file.write('=' * 80 + '\n\n')
while True:
# Log attempt #
if failed_attempts:
if failed_attempts >= self.maximum_attempts:
err_msg = f"Maximum attempts exceeded ({self.maximum_attempts})"
logger.error(err_msg)
self.status = RenderStatus.ERROR
self.errors.append(err_msg)
return
else:
log_file.write(f'\n{"=" * 20} Attempt #{failed_attempts + 1} {"=" * 20}\n\n')
logger.warning(f"Restarting render - Attempt #{failed_attempts + 1}")
self.status = RenderStatus.RUNNING
return_code = self.__setup_and_run_process(log_file, subprocess_cmds)
message = f"{'=' * 50}\n\n{self.engine.name()} render ended with code {return_code} " \
f"after {self.time_elapsed()}\n\n"
log_file.write(message)
# don't try again if we've been cancelled
if self.status in [RenderStatus.CANCELLED, RenderStatus.ERROR]:
return
# if file output hasn't increased, return as error, otherwise restart process.
file_count_has_increased = len(self.file_list()) > initial_file_count
if (self.status == RenderStatus.RUNNING) and file_count_has_increased and not return_code:
break
if return_code:
err_msg = f"{self.engine.name()} render failed with code {return_code}"
logger.error(err_msg)
self.errors.append(err_msg)
# handle instances where engine exits ok but doesn't generate files
if not return_code and not file_count_has_increased:
err_msg = (f"{self.engine.name()} render exited ok, but file count has not increased. "
f"Count is still {len(self.file_list())}")
log_file.write(f'Error: {err_msg}\n\n')
self.errors.append(err_msg)
# only count the attempt as failed if engine creates no output - reset counter on successful output
failed_attempts = 0 if file_count_has_increased else failed_attempts + 1
def __run__wait_for_subjobs(self, logfile):
from src.distributed_job_manager import DistributedJobManager
DistributedJobManager.wait_for_subjobs(parent_job=self)
@staticmethod
def log_and_print(message, log_file, level='info'):
if level == 'debug':
logger.debug(message)
elif level == 'error':
logger.error(message)
else:
logger.info(message)
log_file.write(f"{message}\n")
def __run(self):
def run(self):
# Setup logging
log_dir = os.path.dirname(self.log_path())
os.makedirs(log_dir, exist_ok=True)
with open(self.log_path(), "a") as log_file:
subprocess_cmds = self.generate_subprocess()
initial_file_count = len(self.file_list())
attempt_number = 0
self.log_and_print(f"{self.start_time.isoformat()} - Starting "
f"{self.engine.name()} {self.engine_version} render job for {self.name} "
f"({self.input_path})", log_file)
log_file.write("\n")
if not self.children:
self.__run__subprocess_cycle(log_file)
else:
self.__run__wait_for_subjobs(log_file)
with open(self.log_path(), "a") as f:
# Validate Output - End if missing frames
if self.status == RenderStatus.RUNNING:
file_list_length = len(self.file_list())
expected_list_length = (self.end_frame - self.start_frame + 1) if self.end_frame else 1
f.write(f"{self.start_time.isoformat()} - Starting {self.engine.name()} {self.renderer_version} "
f"render for {self.input_path}\n\n")
f.write(f"Running command: {subprocess_cmds}\n")
f.write('=' * 80 + '\n\n')
msg = f"Frames: Expected ({expected_list_length}) vs actual ({file_list_length}) for {self}"
self.log_and_print(msg, log_file, 'debug')
while True:
# Log attempt #
if attempt_number:
f.write(f'\n{"=" * 80} Attempt #{attempt_number} {"=" * 30}\n\n')
logger.warning(f"Restarting render - Attempt #{attempt_number}")
attempt_number += 1
if file_list_length not in (expected_list_length, 1):
msg = f"Missing frames: Expected ({expected_list_length}) vs actual ({file_list_length})"
self.log_and_print(msg, log_file, 'error')
self.errors.append(msg)
self.status = RenderStatus.ERROR
# todo: create new subjob to generate missing frames
# cleanup and close if cancelled / error
if self.status in [RenderStatus.CANCELLED, RenderStatus.ERROR]:
self.end_time = datetime.now()
message = f"{self.engine.name()} render ended with status '{self.status.value}' " \
f"after {self.time_elapsed()}"
self.log_and_print(message, log_file)
log_file.close()
return
# Post Render Work
if not self.parent:
logger.debug(f"Starting post-processing work for {self}")
self.log_and_print(f"Starting post-processing work for {self}", log_file, 'debug')
self.post_processing()
self.log_and_print(f"Completed post-processing work for {self}", log_file, 'debug')
self.status = RenderStatus.COMPLETED
self.end_time = datetime.now()
message = f"Render {self.name} completed successfully after {self.time_elapsed()}"
self.log_and_print(message, log_file)
def __setup_and_run_process(self, f, subprocess_cmds):
def watchdog():
logger.debug(f'Starting process watchdog for {self} with {self.watchdog_timeout}s timeout')
while self.__process.poll() is None:
time_since_last_update = time.time() - self.__last_output_time
if time_since_last_update > self.watchdog_timeout:
logger.error(f"Process for {self} terminated due to exceeding timeout ({self.watchdog_timeout}s)")
self.__kill_process()
break
# logger.debug(f'Watchdog for {self} - Time since last update: {time_since_last_update}')
time.sleep(1)
logger.debug(f'Stopping process watchdog for {self}')
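The watchdog polls the subprocess and kills it once stdout has been silent longer than the timeout. A self-contained sketch of the same pattern (timeout shortened to 1 s, and the child simply sleeps so it never produces output):

```python
import subprocess
import sys
import threading
import time


def run_with_watchdog(cmd, timeout=1.0):
    """Kill the process if it produces no output for `timeout` seconds."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
    last_output_time = time.time()  # would be refreshed on each stdout line

    def watchdog():
        while proc.poll() is None:
            if time.time() - last_output_time > timeout:
                proc.kill()  # silent too long: terminate
                break
            time.sleep(0.1)

    t = threading.Thread(target=watchdog, daemon=True)
    t.start()
    proc.wait()
    t.join()
    return proc.returncode


# The child sleeps 30 s without output, so the watchdog kills it after ~1 s
code = run_with_watchdog([sys.executable, '-c', 'import time; time.sleep(30)'])
print(code)  # non-zero: the process was killed, not exited cleanly
```

In the worker above, `self.__last_output_time` is refreshed on every stdout line, so a healthy chatty render never trips the timeout.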
return_code = -1
watchdog_thread = threading.Thread(target=watchdog)
watchdog_thread.daemon = True
try:
# Start process and get updates
if os.name == 'posix': # linux / mac
# Start process and get updates
self.__process = subprocess.Popen(subprocess_cmds, stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
universal_newlines=False, preexec_fn=os.setsid)
else: # windows
creationflags = subprocess.CREATE_NEW_PROCESS_GROUP | subprocess.CREATE_NO_WINDOW
self.__process = subprocess.Popen(subprocess_cmds, stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
universal_newlines=False,
creationflags=creationflags)
universal_newlines=False)
# Start watchdog
self.__last_output_time = time.time()
watchdog_thread.start()
for c in io.TextIOWrapper(self.__process.stdout, encoding="utf-8"): # or another encoding
self.last_output = c.strip()
self.__last_output_time = time.time()
try:
for c in io.TextIOWrapper(self.__process.stdout, encoding="utf-8"): # or another encoding
f.write(c)
f.flush()
os.fsync(f.fileno())
except Exception as e:
logger.error(f"Error saving log to disk: {e}")
try:
self.last_output = c.strip()
self._parse_stdout(c.strip())
except Exception as e:
logger.error(f'Error parsing stdout: {e}')
f.write('\n')
f.write('\n')
# Check return codes and process
return_code = self.__process.wait()
except Exception as e:
message = f'Uncaught error running render process: {e}'
f.write(message)
logger.exception(message)
self.__kill_process()
# Check return codes and process
return_code = self.__process.wait()
self.end_time = datetime.now()
# let watchdog end before continuing - prevents multiple watchdogs running when process restarts
if watchdog_thread.is_alive():
watchdog_thread.join()
if self.status in [RenderStatus.CANCELLED, RenderStatus.ERROR]: # user cancelled
message = f"{self.engine.name()} render ended with status '{self.status}' " \
f"after {self.time_elapsed()}"
f.write(message)
return
return return_code
if not return_code:
message = f"{'=' * 50}\n\n{self.engine.name()} render completed successfully in {self.time_elapsed()}"
f.write(message)
break
def __kill_process(self):
try:
if self.__process.poll() is not None:  # process already exited
return
logger.debug(f"Trying to kill process {self.__process}")
self.__process.terminate()
self.__process.kill()
if os.name == 'posix': # linux / macos
os.killpg(os.getpgid(self.__process.pid), signal.SIGTERM)
os.killpg(os.getpgid(self.__process.pid), signal.SIGKILL)
else: # windows
parent = psutil.Process(self.__process.pid)
for child in parent.children(recursive=True):
child.kill()
self.__process.wait(timeout=5)
logger.debug(f"Process ended with status {self.__process.poll()}")
except (ProcessLookupError, AttributeError, psutil.NoSuchProcess):
pass
except Exception as e:
logger.error(f"Error stopping the process: {e}")
# Handle non-zero return codes
message = f"{'=' * 50}\n\n{self.engine.name()} render failed with code {return_code} " \
f"after {self.time_elapsed()}"
f.write(message)
self.errors.append(message)
# if file output hasn't increased, return as error, otherwise restart process.
if len(self.file_list()) <= initial_file_count:
self.status = RenderStatus.ERROR
return
if self.children:
from src.distributed_job_manager import DistributedJobManager
DistributedJobManager.wait_for_subjobs(local_job=self)
# Post Render Work
logger.debug("Starting post-processing work")
self.post_processing()
self.status = RenderStatus.COMPLETED
logger.info(f"Render {self.id}-{self.name} completed successfully after {self.time_elapsed()}")
def post_processing(self):
pass
def is_running(self):
"""Check if the render job is currently running.
Returns:
bool: True if the job is running, False otherwise.
"""
if hasattr(self, '_BaseRenderWorker__thread'):  # private attrs are name-mangled
if self.__thread:
return self.__thread.is_alive()
return False
def log_error(self, error_line, halt_render=False):
"""Log an error and optionally halt the render.
Args:
error_line: Error message to log.
halt_render: Whether to stop the render due to this error.
"""
logger.error(error_line)
self.errors.append(error_line)
if halt_render:
self.stop(is_error=True)
def stop(self, is_error=False):
"""Stop the render job.
Args:
is_error: Whether this stop is due to an error.
"""
logger.debug(f"Stopping {self}")
# cleanup status
if self.status in [RenderStatus.RUNNING, RenderStatus.NOT_STARTED, RenderStatus.SCHEDULED,
RenderStatus.CONFIGURING]:
if hasattr(self, '_BaseRenderWorker__process'):  # private attrs are name-mangled
try:
process = psutil.Process(self.__process.pid)
for proc in process.children(recursive=True):
proc.kill()
process.kill()
except Exception as e:
logger.debug(f"Error stopping the process: {e}")
if self.status in [RenderStatus.RUNNING, RenderStatus.NOT_STARTED, RenderStatus.SCHEDULED]:
if is_error:
err_message = self.errors[-1] if self.errors else 'Unknown error'
logger.error(f"Halting render due to error: {err_message}")
@@ -517,46 +275,28 @@ class BaseRenderWorker(Base):
else:
self.status = RenderStatus.CANCELLED
self.__kill_process()
if self.is_running(): # allow the log files to close
self.__thread.join(timeout=5)
def percent_complete(self):
return 0
def _parse_stdout(self, line):
raise NotImplementedError("_parse_stdout not implemented")
def time_elapsed(self):
"""Get elapsed time for this job.
Returns:
str: Formatted time elapsed string.
"""
return get_time_elapsed(self.start_time, self.end_time)
def file_list(self):
"""Get list of output files for this job.
Returns:
List[str]: List of output file paths.
"""
try:
job_dir = os.path.dirname(self.output_path)
file_list = [
os.path.join(job_dir, file)
for file in os.listdir(job_dir)
if not file.startswith('.') # Ignore hidden files
]
file_list = [os.path.join(job_dir, file) for file in os.listdir(job_dir)]
file_list.sort()
return file_list
except FileNotFoundError:
return []
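The hidden-file filter can be exercised against a throwaway directory; the file names here are arbitrary:

```python
import os
import tempfile


def visible_files(job_dir):
    """List files in job_dir, skipping dotfiles, sorted."""
    files = [os.path.join(job_dir, f) for f in os.listdir(job_dir)
             if not f.startswith('.')]  # ignore hidden files like .DS_Store
    files.sort()
    return files


with tempfile.TemporaryDirectory() as d:
    for name in ('frame_0001.png', 'frame_0002.png', '.DS_Store'):
        open(os.path.join(d, name), 'w').close()
    print([os.path.basename(p) for p in visible_files(d)])
    # ['frame_0001.png', 'frame_0002.png']
```

Skipping dotfiles matters for the attempt counter above: a stray `.DS_Store` would otherwise count as render output.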
def json(self):
"""Convert worker to JSON-serializable dictionary.
Returns:
Dict[str, Any]: Dictionary representation of worker data.
"""
job_dict = {
'id': self.id,
'name': self.name,
'hostname': self.hostname,
'input_path': self.input_path,
'output_path': self.output_path,
'priority': self.priority,
@@ -569,23 +309,20 @@ class BaseRenderWorker(Base):
'file_hash': self.file_hash,
'percent_complete': self.percent_complete(),
'file_count': len(self.file_list()),
'engine': self.engine_name,
'engine_version': self.engine_version,
'renderer': self.renderer,
'renderer_version': self.renderer_version,
'errors': getattr(self, 'errors', None),
'start_frame': self.start_frame,
'end_frame': self.end_frame,
'total_frames': self.total_frames,
'last_output': getattr(self, 'last_output', None),
'log_path': self.log_path(),
'args': self.args
'log_path': self.log_path()
}
# convert to json and back to auto-convert dates to iso format
def date_serializer(o):
"""Serialize datetime objects to ISO format."""
if isinstance(o, datetime):
return o.isoformat()
return None
json_convert = json.dumps(job_dict, default=date_serializer)
worker_json = json.loads(json_convert)
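Round-tripping through `json.dumps` with a `default` hook converts every datetime in the dictionary to ISO-8601 in one pass; a minimal standalone version:

```python
import json
from datetime import datetime


def date_serializer(o):
    """Serialize datetime objects to ISO format for json.dumps."""
    if isinstance(o, datetime):
        return o.isoformat()
    return None


job_dict = {'id': 'a1b2c3d4', 'date_created': datetime(2023, 11, 17, 8, 18, 7)}
worker_json = json.loads(json.dumps(job_dict, default=date_serializer))
print(worker_json['date_created'])  # 2023-11-17T08:18:07
```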
@@ -593,15 +330,6 @@ class BaseRenderWorker(Base):
def timecode_to_frames(timecode, frame_rate):
"""Convert timecode string to frame number.
Args:
timecode: Timecode in format HH:MM:SS:FF.
frame_rate: Frames per second.
Returns:
int: Frame number corresponding to timecode.
"""
e = [int(x) for x in timecode.split(':')]
seconds = (e[0] * 3600) + (e[1] * 60) + e[2]
frames = (seconds * frame_rate) + e[-1] + 1
return frames
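A standalone version of the HH:MM:SS:FF conversion, with the hour term spelled out as 3600 s and a quick check at 24 fps:

```python
def timecode_to_frames(timecode, frame_rate):
    """Convert an HH:MM:SS:FF timecode to an absolute (1-based) frame number."""
    h, m, s, ff = (int(x) for x in timecode.split(':'))
    seconds = (h * 3600) + (m * 60) + s
    return (seconds * frame_rate) + ff + 1  # 1-based, as in the worker above


print(timecode_to_frames('00:00:01:00', 24))  # 25
print(timecode_to_frames('00:01:00:12', 24))  # 1453
```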

View File

@@ -2,402 +2,226 @@ import logging
import os
import shutil
import threading
import concurrent.futures
from pathlib import Path
from typing import Type, List, Dict, Any, Optional
from src.engines.core.base_engine import BaseRenderEngine
from src.engines.blender.blender_engine import Blender
from src.engines.ffmpeg.ffmpeg_engine import FFMPEG
from src.utilities.misc_helper import current_system_os, current_system_cpu
from src.utilities.misc_helper import system_safe_path, current_system_os, current_system_cpu
logger = logging.getLogger()
ENGINE_CLASSES = [Blender, FFMPEG]
class EngineManager:
"""Manage installed render engine versions and handle fetching and downloading new versions where possible."""
engines_path: Optional[str] = None
download_tasks: List[Any] = []
engines_path = None
download_tasks = []
@staticmethod
def supported_engines() -> List[Type[BaseRenderEngine]]:
"""Return list of supported engine classes.
Returns:
List[Type[BaseRenderEngine]]: List of available engine classes.
"""
return ENGINE_CLASSES
# --- Installed Engines ---
def supported_engines():
return [Blender, FFMPEG]
@classmethod
def engine_class_for_project_path(cls, path: str) -> Type[BaseRenderEngine]:
"""Find engine class that can handle the given project file.
Args:
path: Path to project file.
Returns:
Type[BaseRenderEngine]: Engine class that can handle the file.
"""
_, extension = os.path.splitext(path)
extension = extension.lower().strip('.')
for engine_class in cls.supported_engines():
engine = cls.get_latest_engine_instance(engine_class)
if extension in engine.supported_extensions():
return engine_class
undefined_renderer_support = [x for x in cls.supported_engines() if not cls.get_latest_engine_instance(x).supported_extensions()]
return undefined_renderer_support[0]
@classmethod
def engine_class_with_name(cls, engine_name: str) -> Optional[Type[BaseRenderEngine]]:
"""Find engine class by name.
Args:
engine_name: Name of engine to find.
Returns:
Optional[Type[BaseRenderEngine]]: Engine class if found, None otherwise.
"""
def engine_with_name(cls, engine_name):
for obj in cls.supported_engines():
if obj.name().lower() == engine_name.lower():
return obj
return None
@classmethod
def get_latest_engine_instance(cls, engine_class: Type[BaseRenderEngine]) -> BaseRenderEngine:
"""Create instance of latest installed engine version.
def all_engines(cls):
Args:
engine_class: Engine class to instantiate.
Returns:
BaseRenderEngine: Instance of engine with latest version.
"""
newest = cls.newest_installed_engine_data(engine_class.name())
engine = engine_class(newest["path"])
return engine
@classmethod
def get_installed_engine_data(cls, filter_name: Optional[str] = None, include_corrupt: bool = False,
ignore_system: bool = False) -> List[Dict[str, Any]]:
"""Get data about installed render engines.
Args:
filter_name: Optional engine name to filter by.
include_corrupt: Whether to include potentially corrupted installations.
ignore_system: Whether to ignore system-installed engines.
Returns:
List[Dict[str, Any]]: List of installed engine data.
Raises:
FileNotFoundError: If engines path is not set.
"""
if not cls.engines_path:
raise FileNotFoundError("Engine path is not set")
raise FileNotFoundError("Engines path must be set before requesting downloads")
# Parse downloaded engine directory
results = []
try:
all_items = os.listdir(cls.engines_path)
all_directories = [item for item in all_items if os.path.isdir(os.path.join(cls.engines_path, item))]
keys = ["engine", "version", "system_os", "cpu"] # Define keys for result dictionary
for directory in all_directories:
# Split directory name into segments
# Split the input string by dashes to get segments
segments = directory.split('-')
# Create a dictionary mapping keys to corresponding segments
# Create a dictionary with named keys
keys = ["engine", "version", "system_os", "cpu"]
result_dict = {keys[i]: segments[i] for i in range(min(len(keys), len(segments)))}
result_dict['type'] = 'managed'
# Initialize binary_name with engine name
# Figure out the binary name for the path
binary_name = result_dict['engine'].lower()
# Determine the correct binary name based on the engine and system_os
eng = cls.engine_class_with_name(result_dict['engine'])
binary_name = eng.binary_names.get(result_dict['system_os'], binary_name)
for eng in cls.supported_engines():
if eng.name().lower() == result_dict['engine']:
binary_name = eng.binary_names.get(result_dict['system_os'], binary_name)
# Find path to binary
path = None
for root, _, files in os.walk(system_safe_path(os.path.join(cls.engines_path, directory))):
if binary_name in files:
path = os.path.join(root, binary_name)
break
# Find the path to the binary file
search_root = Path(cls.engines_path) / directory
match = next((p for p in search_root.rglob(binary_name) if p.is_file()), None)
path = str(match) if match else None
result_dict['path'] = path
# fetch version number from binary - helps detect corrupted downloads - disabled due to perf issues
# binary_version = eng(path).version()
# if not binary_version:
# logger.warning(f"Possible corrupt {eng.name()} {result_dict['version']} install detected: {path}")
# if not include_corrupt:
# continue
# result_dict['version'] = binary_version or 'error'
# Add the result dictionary to results if it matches the filter_name or if no filter is applied
if not filter_name or filter_name == result_dict['engine']:
results.append(result_dict)
results.append(result_dict)
except FileNotFoundError as e:
logger.warning(f"Cannot find local engines download directory: {e}")
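The directory-name convention is `engine-version-os-cpu`; zipping a fixed key list against the split segments tolerates names with fewer segments. With hypothetical directory names:

```python
def parse_engine_dir(directory):
    """Parse an 'engine-version-os-cpu' directory name into a dict."""
    keys = ["engine", "version", "system_os", "cpu"]
    segments = directory.split('-')
    return {keys[i]: segments[i] for i in range(min(len(keys), len(segments)))}


print(parse_engine_dir('blender-3.6.5-macos-arm64'))
# {'engine': 'blender', 'version': '3.6.5', 'system_os': 'macos', 'cpu': 'arm64'}
print(parse_engine_dir('ffmpeg-6.0'))
# {'engine': 'ffmpeg', 'version': '6.0'}
```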
# add system installs to this list - use bg thread because it can be slow
def fetch_engine_details(eng, include_corrupt=False):
version = eng().version()
if not version and not include_corrupt:
return
return {
'engine': eng.name(),
'version': version or 'error',
'system_os': current_system_os(),
'cpu': current_system_cpu(),
'path': eng.default_engine_path(),
'type': 'system'
}
if not ignore_system:
with concurrent.futures.ThreadPoolExecutor() as executor:
futures = {
executor.submit(fetch_engine_details, eng, include_corrupt): eng.name()
for eng in cls.supported_engines()
if eng.default_engine_path() and (not filter_name or filter_name == eng.name())
}
for future in concurrent.futures.as_completed(futures):
result = future.result()
if result:
results.append(result)
# add system installs to this list
for eng in cls.supported_engines():
if eng.default_renderer_path():
results.append({'engine': eng.name(), 'version': eng().version(),
'system_os': current_system_os(),
'cpu': current_system_cpu(),
'path': eng.default_renderer_path(), 'type': 'system'})
return results
# --- Check for Updates ---
@classmethod
def all_versions_for_engine(cls, engine):
return [x for x in cls.all_engines() if x['engine'] == engine]
@classmethod
def update_all_engines(cls) -> None:
"""Check for and download updates for all downloadable engines."""
for engine in cls.downloadable_engines():
update_available = cls.is_engine_update_available(engine)
if update_available:
update_available['name'] = engine.name()
cls.download_engine(engine.name(), update_available['version'], background=True)
@classmethod
def all_version_data_for_engine(cls, engine_name: str, include_corrupt=False, ignore_system=False) -> list:
"""Get all version data for a specific engine.
Args:
engine_name: Name of engine to query.
include_corrupt: Whether to include corrupt installations.
ignore_system: Whether to ignore system installations.
Returns:
list: Sorted list of engine version data (newest first).
"""
versions = cls.get_installed_engine_data(filter_name=engine_name, include_corrupt=include_corrupt, ignore_system=ignore_system)
sorted_versions = sorted(versions, key=lambda x: x['version'], reverse=True)
return sorted_versions
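One caveat: sorting version strings lexicographically mis-orders double-digit components ('3.10' sorts before '3.9'). A numeric tuple key avoids that; this is a sketch of the safer key, not the manager's current behavior:

```python
def version_key(version):
    """Split a dotted version into an integer tuple for numeric sorting."""
    return tuple(int(part) for part in version.split('.') if part.isdigit())


versions = ['3.9.1', '3.10.0', '3.2.2']
print(sorted(versions, key=version_key, reverse=True))
# ['3.10.0', '3.9.1', '3.2.2']
```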
@classmethod
def newest_installed_engine_data(cls, engine_name: str, system_os=None, cpu=None, ignore_system=None) -> list:
"""Get newest installed engine data for specific platform.
Args:
engine_name: Name of engine to query.
system_os: Operating system to filter by (defaults to current).
cpu: CPU architecture to filter by (defaults to current).
ignore_system: Whether to ignore system installations.
Returns:
list: Newest engine data or empty list if not found.
"""
def newest_engine_version(cls, engine, system_os=None, cpu=None):
system_os = system_os or current_system_os()
cpu = cpu or current_system_cpu()
try:
filtered = [x for x in cls.all_version_data_for_engine(engine_name, ignore_system=ignore_system)
if x['system_os'] == system_os and x['cpu'] == cpu]
return filtered[0]
filtered = [x for x in cls.all_engines() if x['engine'] == engine and x['system_os'] == system_os and x['cpu'] == cpu]
versions = sorted(filtered, key=lambda x: x['version'], reverse=True)
return versions[0]
except IndexError:
logger.error(f"Cannot find newest engine version for {engine_name}-{system_os}-{cpu}")
return []
logger.error(f"Cannot find newest engine version for {engine}-{system_os}-{cpu}")
return None
@classmethod
def is_version_installed(cls, engine_name: str, version: str, system_os=None, cpu=None, ignore_system=False):
"""Check if specific engine version is installed.
Args:
engine_name: Name of engine to check.
version: Version string to check.
system_os: Operating system to check (defaults to current).
cpu: CPU architecture to check (defaults to current).
ignore_system: Whether to ignore system installations.
Returns:
Engine data if found, False otherwise.
"""
def is_version_downloaded(cls, engine, version, system_os=None, cpu=None):
system_os = system_os or current_system_os()
cpu = cpu or current_system_cpu()
filtered = [x for x in cls.get_installed_engine_data(filter_name=engine_name, ignore_system=ignore_system) if
x['system_os'] == system_os and x['cpu'] == cpu and x['version'] == version]
filtered = [x for x in cls.all_engines() if
x['engine'] == engine and x['system_os'] == system_os and x['cpu'] == cpu and x['version'] == version]
return filtered[0] if filtered else False
@classmethod
def version_is_available_to_download(cls, engine_name: str, version, system_os=None, cpu=None):
def version_is_available_to_download(cls, engine, version, system_os=None, cpu=None):
try:
downloader = cls.engine_class_with_name(engine_name).downloader()
downloader = cls.engine_with_name(engine).downloader()
return downloader.version_is_available_to_download(version=version, system_os=system_os, cpu=cpu)
except Exception as e:
logger.debug(f"Exception in version_is_available_to_download: {e}")
return None
@classmethod
def find_most_recent_version(cls, engine_name: str, system_os=None, cpu=None, lts_only=False) -> dict:
def find_most_recent_version(cls, engine=None, system_os=None, cpu=None, lts_only=False):
try:
downloader = cls.engine_class_with_name(engine_name).downloader()
downloader = cls.engine_with_name(engine).downloader()
return downloader.find_most_recent_version(system_os=system_os, cpu=cpu)
except Exception as e:
logger.debug(f"Exception in find_most_recent_version: {e}")
return {}
@classmethod
def is_engine_update_available(cls, engine_class: Type[BaseRenderEngine], ignore_system_installs=False):
logger.debug(f"Checking for updates to {engine_class.name()}")
latest_version = engine_class.downloader().find_most_recent_version()
if not latest_version:
logger.warning(f"Could not find most recent version of {engine_class.name()} to download")
return None
version_num = latest_version.get('version')
if cls.is_version_installed(engine_class.name(), version_num, ignore_system=ignore_system_installs):
logger.debug(f"Latest version of {engine_class.name()} ({version_num}) already downloaded")
return None
return latest_version
# --- Downloads ---
@classmethod
def downloadable_engines(cls):
"""Get list of engines that support downloading.
Returns:
List[Type[BaseRenderEngine]]: Engines with downloader capability.
"""
return [engine for engine in cls.supported_engines() if hasattr(engine, "downloader") and engine.downloader()]
@classmethod
def get_existing_download_task(cls, engine_name, version, system_os=None, cpu=None):
for task in cls.download_tasks:
task_parts = task.name.split('-')
task_engine, task_version, task_system_os, task_cpu = task_parts[:4]
if engine_name == task_engine and version == task_version:
if system_os in (task_system_os, None) and cpu in (task_cpu, None):
return task
return None
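For context, the task lookup above relies on the thread-name convention used when downloads are started (`"<engine>-<version>-<system_os>-<cpu>"`). A minimal sketch of parsing that convention, with a hypothetical task name:

```python
def parse_task_name(task_name: str):
    # Recover the first four fields of the "<engine>-<version>-<os>-<cpu>"
    # naming convention; a version string containing '-' would break this split.
    return tuple(task_name.split('-')[:4])

print(parse_task_name('ffmpeg-6.0-linux-x64'))  # ('ffmpeg', '6.0', 'linux', 'x64')
```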
@classmethod
def download_engine(cls, engine_name, version, system_os=None, cpu=None, background=False, ignore_system=False):
engine_to_download = cls.engine_class_with_name(engine_name)
existing_task = cls.get_existing_download_task(engine_name, version, system_os, cpu)
if existing_task:
logger.debug(f"Already downloading {engine_name} {version}")
if not background:
existing_task.join() # If download task exists, wait until it's done downloading
return None
elif not engine_to_download.downloader():
logger.warning("No valid downloader for this engine. Please update this software manually.")
return None
elif not cls.engines_path:
raise FileNotFoundError("Engines path must be set before requesting downloads")
thread = EngineDownloadWorker(engine_name, version, system_os, cpu)
cls.download_tasks.append(thread)
thread.start()
if background:
return thread
thread.join()
found_engine = cls.is_version_installed(engine_name, version, system_os, cpu, ignore_system) # Check that engine downloaded
if not found_engine:
logger.error(f"Error downloading {engine_name}")
return found_engine
@classmethod
def delete_engine_download(cls, engine_name, version, system_os=None, cpu=None):
logger.info(f"Requested deletion of engine: {engine_name}-{version}")
found = cls.is_version_installed(engine_name, version, system_os, cpu)
if found and found['type'] == 'managed': # don't delete system installs
# find the root directory of the engine executable
root_dir_name = '-'.join([engine_name, version, found['system_os'], found['cpu']])
remove_path = os.path.join(found['path'].split(root_dir_name)[0], root_dir_name)
# delete the file path
logger.info(f"Deleting engine at path: {remove_path}")
shutil.rmtree(remove_path, ignore_errors=False)
logger.info(f"Engine {engine_name}-{version}-{found['system_os']}-{found['cpu']} successfully deleted")
return True
elif found: # these are managed by the system / user. Don't delete these.
logger.error(f'Cannot delete requested {engine_name} {version}. Managed externally.')
else:
logger.error(f"Cannot find engine: {engine_name}-{version}")
return False
# --- Background Tasks ---
@classmethod
def update_all_engines(cls):
def engine_update_task(engine):
logger.debug(f"Checking for updates to {engine.name()}")
latest_version = engine.downloader().find_most_recent_version()
if latest_version:
logger.debug(f"Latest version of {engine.name()} available: {latest_version.get('version')}")
if not cls.is_version_downloaded(engine.name(), latest_version.get('version')):
logger.info(f"Downloading latest version of {engine.name()}...")
cls.download_engine(engine=engine.name(), version=latest_version['version'], background=True)
else:
logger.warning(f"Unable to check for updates for {engine.name()}")
logger.info("Checking for updates for render engines...")
threads = []
for engine in cls.supported_engines():
if engine.downloader():
thread = threading.Thread(target=engine_update_task, args=(engine,))
threads.append(thread)
thread.start()
@classmethod
def active_downloads(cls) -> list:
"""Get list of currently active download tasks.
Returns:
list: List of active EngineDownloadWorker threads.
"""
return [x for x in cls.download_tasks if x.is_alive()]
@classmethod
def create_worker(cls, engine_name: str, input_path: Path, output_path: Path, engine_version=None, args=None, parent=None, name=None):
"""
Create and return a worker instance for a specific engine.
This resolves the appropriate engine binary/path for the requested engine and version,
downloading the engine if necessary (when a specific version is requested and not found
locally). The returned worker is constructed with string paths for compatibility with
worker implementations that expect `str` rather than `Path`.
Args:
engine_name: The engine name used to resolve an engine class and its worker.
input_path: Path to the input file/folder for the worker to process.
output_path: Path where the worker should write output.
engine_version: Optional engine version to use. If `None` or `'latest'`, the newest
installed version is used. If a specific version is provided and not installed,
the engine will be downloaded.
args: Optional arguments passed through to the worker (engine-specific).
parent: Optional Qt/GUI parent object passed through to the worker constructor.
name: Optional name/label passed through to the worker constructor.
Returns:
An instance of the engine-specific worker class.
Raises:
FileNotFoundError: If no versions of the engine are installed, if the requested
version cannot be found or downloaded, or if the engine path cannot be resolved.
"""
worker_class = cls.engine_class_with_name(engine_name).worker_class()
# check to make sure we have versions installed
all_versions = cls.all_version_data_for_engine(engine_name)
if not all_versions:
raise FileNotFoundError(f"Cannot find any installed '{engine_name}' engines")
# Find the path to the requested engine version or use default
engine_path = None
if engine_version and engine_version != 'latest':
for ver in all_versions:
if ver['version'] == engine_version:
engine_path = ver['path']
@@ -405,80 +229,27 @@ class EngineManager:
# Download the required engine if not found locally
if not engine_path:
download_result = cls.download_engine(engine_name, engine_version)
if not download_result:
raise FileNotFoundError(f"Cannot download requested version: {engine_name} {engine_version}")
engine_path = download_result['path']
logger.info("Engine downloaded. Creating worker.")
else:
logger.debug(f"Using latest engine version ({all_versions[0]['version']})")
engine_path = all_versions[0]['path']
if not engine_path:
raise FileNotFoundError(f"Cannot find requested engine version {engine_version}")
return worker_class(input_path=str(input_path), output_path=str(output_path), engine_path=engine_path, args=args,
parent=parent, name=name)
class EngineDownloadWorker(threading.Thread):
"""A thread worker for downloading a specific version of a rendering engine.
This class handles the process of downloading a rendering engine in a separate thread,
ensuring that the download process does not block the main application.
Attributes:
engine (str): The name of the rendering engine to download.
version (str): The version of the rendering engine to download.
system_os (str, optional): The operating system for which to download the engine. Defaults to current OS type.
cpu (str, optional): Requested CPU architecture. Defaults to system CPU type.
"""
def __init__(self, engine, version, system_os=None, cpu=None):
"""Initialize download worker for specific engine version.
Args:
engine: Name of engine to download.
version: Version of engine to download.
system_os: Target operating system (defaults to current).
cpu: Target CPU architecture (defaults to current).
"""
super().__init__()
self.engine = engine
self.version = version
self.system_os = system_os
self.cpu = cpu
self.percent_complete = 0
def _update_progress(self, current_progress):
"""Update download progress.
Args:
current_progress: Current download progress percentage (0-100).
"""
self.percent_complete = current_progress
def run(self):
"""Execute the download process.
Checks if engine version already exists, then downloads if not found.
Handles cleanup and error reporting.
"""
try:
existing_download = EngineManager.is_version_installed(self.engine, self.version, self.system_os, self.cpu,
ignore_system=True)
if existing_download:
logger.info(f"Requested download of {self.engine} {self.version}, but local copy already exists")
return existing_download
# Get the appropriate downloader class based on the engine type
downloader = EngineManager.engine_class_with_name(self.engine).downloader()
downloader.download_engine(self.version, download_location=EngineManager.engines_path,
system_os=self.system_os, cpu=self.cpu, timeout=300, progress_callback=self._update_progress)
except Exception as e:
logger.error(f"Error in download worker: {e}")
finally:
# remove itself from the downloader list
EngineManager.download_tasks.remove(self)
@classmethod
def engine_for_project_path(cls, path):
name, extension = os.path.splitext(path)
extension = extension.lower().strip('.')
for engine in cls.supported_engines():
if extension in engine.supported_extensions():
return engine
undefined_renderer_support = [x for x in cls.supported_engines() if not x.supported_extensions()]
return undefined_renderer_support[0]
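The extension matching above normalizes the project file's suffix before comparing it against each engine's `supported_extensions()`. A small sketch of that normalization (the file names are illustrative):

```python
import os

def normalized_extension(path: str) -> str:
    # Lowercase the suffix and drop the leading dot, matching the logic above.
    _, extension = os.path.splitext(path)
    return extension.lower().strip('.')

print(normalized_extension('/renders/shot_010.MP4'))  # mp4
```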
if __name__ == '__main__':
@@ -488,4 +259,4 @@ if __name__ == '__main__':
# EngineManager.delete_engine_download('blender', '3.2.1', 'macos', 'a')
EngineManager.engines_path = "/Users/brettwilliams/zordon-uploads/engines"
# print(EngineManager.is_version_downloaded("ffmpeg", "6.0"))
print(EngineManager.get_installed_engine_data())

View File

@@ -90,14 +90,14 @@ class FFMPEGDownloader(EngineDownloader):
return releases
@classmethod
def all_versions(cls, system_os=None, cpu=None):
system_os = system_os or current_system_os()
cpu = cpu or current_system_cpu()
versions_per_os = {'linux': cls.__get_linux_versions, 'macos': cls.__get_macos_versions,
'windows': cls.__get_windows_versions}
if not versions_per_os.get(system_os):
logger.error(f"Cannot find version list for {system_os}")
return None
results = []
all_versions = versions_per_os[system_os]()
@@ -131,28 +131,28 @@ class FFMPEGDownloader(EngineDownloader):
try:
system_os = system_os or current_system_os()
cpu = cpu or current_system_cpu()
return cls.all_versions(system_os, cpu)[0]
except (IndexError, requests.exceptions.RequestException) as e:
logger.error(f"Cannot get most recent version of ffmpeg: {e}")
return {}
@classmethod
def version_is_available_to_download(cls, version, system_os=None, cpu=None):
for ver in cls.all_versions(system_os, cpu):
if ver['version'] == version:
return ver
return None
@classmethod
def download_engine(cls, version, download_location, system_os=None, cpu=None, timeout=120, progress_callback=None):
system_os = system_os or current_system_os()
cpu = cpu or current_system_cpu()
# Verify requested version is available
found_version = [item for item in cls.all_versions(system_os, cpu) if item['version'] == version]
if not found_version:
logger.error(f"Cannot find FFMPEG version {version} for {system_os} and {cpu}")
return None
# Platform specific naming cleanup
remote_url = cls.__get_remote_url_for_version(version=version, system_os=system_os, cpu=cpu)
@@ -162,8 +162,7 @@ class FFMPEGDownloader(EngineDownloader):
# Download and extract
try:
logger.info(f"Requesting download of ffmpeg-{version}-{system_os}-{cpu}")
cls.download_and_extract_app(remote_url=remote_url, download_location=download_location, timeout=timeout,
progress_callback=progress_callback)
# naming cleanup to match existing naming convention
output_path = os.path.join(download_location, f'ffmpeg-{version}-{system_os}-{cpu}')
@@ -183,5 +182,4 @@ if __name__ == "__main__":
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
# print(FFMPEGDownloader.download_engine('6.0', '/Users/brett/zordon-uploads/engines/'))
# print(FFMPEGDownloader.find_most_recent_version(system_os='linux'))
print(FFMPEGDownloader.download_engine(version='6.0', download_location='/Users/brett/zordon-uploads/engines/',
system_os='linux', cpu='x64'))

View File

@@ -1,13 +1,11 @@
import json
import os
import re
from src.engines.core.base_engine import *
_creationflags = subprocess.CREATE_NO_WINDOW if platform.system() == 'Windows' else 0
class FFMPEG(BaseRenderEngine):
binary_names = {'linux': 'ffmpeg', 'windows': 'ffmpeg.exe', 'macos': 'ffmpeg'}
@staticmethod
@@ -20,13 +18,11 @@ class FFMPEG(BaseRenderEngine):
from src.engines.ffmpeg.ffmpeg_worker import FFMPEGRenderWorker
return FFMPEGRenderWorker
def ui_options(self):
return []
def supported_extensions(self):
help_text = (subprocess.check_output([self.engine_path(), '-h', 'full'], stderr=subprocess.STDOUT,
creationflags=_creationflags).decode('utf-8'))
found = re.findall(r'extensions that .* is allowed to access \(default "(.*)"', help_text)
found_extensions = set()
for match in found:
found_extensions.update(match.split(','))
@@ -35,9 +31,9 @@ class FFMPEG(BaseRenderEngine):
def version(self):
version = None
try:
ver_out = subprocess.check_output([self.engine_path(), '-version'], timeout=SUBPROCESS_TIMEOUT,
creationflags=_creationflags).decode('utf-8')
match = re.match(r".*version\s*([\w.*]+)\W*", ver_out)
if match:
version = match.groups()[0]
except Exception as e:
@@ -45,24 +41,14 @@ class FFMPEG(BaseRenderEngine):
return version
def get_project_info(self, project_path, timeout=10):
"""Run ffprobe and parse the output as JSON"""
try:
# resolve ffprobe path
engine_dir = os.path.dirname(self.engine_path())
ffprobe_path = os.path.join(engine_dir, 'ffprobe')
if self.engine_path().endswith('.exe'):
ffprobe_path += '.exe'
if not os.path.exists(ffprobe_path): # fallback to system install (if available)
ffprobe_path = 'ffprobe'
# run ffprobe
cmd = [
ffprobe_path, '-v', 'quiet', '-print_format', 'json',
'-show_streams', '-select_streams', 'v', project_path
]
output = subprocess.check_output(cmd, stderr=subprocess.STDOUT, text=True,
creationflags=_creationflags)
video_info = json.loads(output)
# Extract the necessary information
video_stream = video_info['streams'][0]
@@ -87,13 +73,13 @@ class FFMPEG(BaseRenderEngine):
}
except Exception as e:
print(f"Failed to get FFMPEG project info: {e}")
return None
def get_encoders(self):
raw_stdout = subprocess.check_output([self.engine_path(), '-encoders'], stderr=subprocess.DEVNULL,
timeout=SUBPROCESS_TIMEOUT, creationflags=_creationflags).decode('utf-8')
pattern = r'(?P<type>[VASFXBD.]{6})\s+(?P<name>\S{2,})\s+(?P<description>.*)'
encoders = [m.groupdict() for m in re.finditer(pattern, raw_stdout)]
return encoders
@@ -103,10 +89,9 @@ class FFMPEG(BaseRenderEngine):
def get_all_formats(self):
try:
formats_raw = subprocess.check_output([self.engine_path(), '-formats'], stderr=subprocess.DEVNULL,
timeout=SUBPROCESS_TIMEOUT,
creationflags=_creationflags).decode('utf-8')
pattern = r'(?P<type>[DE]{1,2})\s+(?P<id>\S{2,})\s+(?P<name>.*)'
all_formats = [m.groupdict() for m in re.finditer(pattern, formats_raw)]
return all_formats
except Exception as e:
@@ -117,8 +102,7 @@ class FFMPEG(BaseRenderEngine):
# Extract the common extension using regex
muxer_flag = 'muxer' if 'E' in ffmpeg_format['type'] else 'demuxer'
format_detail_raw = subprocess.check_output(
[self.engine_path(), '-hide_banner', '-h', f"{muxer_flag}={ffmpeg_format['id']}"],
creationflags=_creationflags).decode('utf-8')
pattern = r"Common extensions: (\w+)"
common_extensions = re.findall(pattern, format_detail_raw)
found_extensions = []
@@ -130,18 +114,17 @@ class FFMPEG(BaseRenderEngine):
return [x['id'] for x in self.get_all_formats() if 'E' in x['type'].upper()]
def get_frame_count(self, path_to_file):
raw_stdout = subprocess.check_output([self.engine_path(), '-i', path_to_file, '-map', '0:v:0', '-c', 'copy',
'-f', 'null', '-'], stderr=subprocess.STDOUT,
timeout=SUBPROCESS_TIMEOUT, creationflags=_creationflags).decode('utf-8')
match = re.findall(r'frame=\s*(\d+)', raw_stdout)
if match:
frame_number = int(match[-1])
return frame_number
return -1
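`get_frame_count` depends on the last `frame=` progress line ffmpeg writes to stderr; a sketch of that parsing over sample progress text (the stats lines are illustrative):

```python
import re

# Two progress lines in the style ffmpeg emits with `-stats` (illustrative).
raw_stdout = "frame=  120 fps=60 q=-1.0 size=1024kB\nframe=  240 fps=58 q=-1.0 size=2048kB\n"
matches = re.findall(r'frame=\s*(\d+)', raw_stdout)
frame_number = int(matches[-1]) if matches else -1  # last entry is the final count
print(frame_number)  # 240
```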
def get_arguments(self):
help_text = (subprocess.check_output([self.engine_path(), '-h', 'long'], stderr=subprocess.STDOUT,
creationflags=_creationflags).decode('utf-8'))
lines = help_text.splitlines()
options = {}
@@ -168,4 +151,4 @@ class FFMPEG(BaseRenderEngine):
if __name__ == "__main__":
print(FFMPEG().get_all_formats())

View File

@@ -1,5 +1,6 @@
#!/usr/bin/env python3
import re
import subprocess
from src.engines.core.base_worker import BaseRenderWorker
from src.engines.ffmpeg.ffmpeg_engine import FFMPEG
@@ -16,11 +17,11 @@ class FFMPEGRenderWorker(BaseRenderWorker):
def generate_worker_subprocess(self):
cmd = [self.engine_path, '-y', '-stats', '-i', self.input_path]
# Resize frame
if self.args.get('resolution', None):
cmd.extend(['-vf', f"scale={self.args['resolution'][0]}:{self.args['resolution'][1]}"])
# Convert raw args from string if available
raw_args = self.args.get('raw', None)
@@ -28,7 +29,7 @@ class FFMPEGRenderWorker(BaseRenderWorker):
cmd.extend(raw_args.split(' '))
# Close with output path
cmd.extend(['-max_muxing_queue_size', '1024', self.output_path])
return cmd
def percent_complete(self):

70
src/init.py Normal file
View File

@@ -0,0 +1,70 @@
''' src/init.py '''
import logging
import os
import sys
import threading
from collections import deque
from PyQt6.QtCore import QObject, pyqtSignal
from PyQt6.QtWidgets import QApplication
from .render_queue import RenderQueue
from .ui.main_window import MainWindow
from src.api.api_server import start_server
from src.utilities.config import Config
from src.utilities.misc_helper import system_safe_path
def run() -> int:
"""
Initializes the application and runs it.
Returns:
int: The exit status code.
"""
# Load Config YAML
config_dir = os.path.join(os.path.dirname(os.path.dirname(os.path.abspath(__file__))), 'config')
Config.load_config(system_safe_path(os.path.join(config_dir, 'config.yaml')))
logging.basicConfig(format='%(asctime)s: %(levelname)s: %(module)s: %(message)s', datefmt='%d-%b-%y %H:%M:%S',
level=Config.server_log_level.upper())
app: QApplication = QApplication(sys.argv)
# Start server in background
background_server = threading.Thread(target=start_server)
background_server.daemon = True
background_server.start()
# Setup logging for console ui
buffer_handler = BufferingHandler()
buffer_handler.setFormatter(logging.getLogger().handlers[0].formatter)
logger = logging.getLogger()
logger.addHandler(buffer_handler)
window: MainWindow = MainWindow()
window.buffer_handler = buffer_handler
window.show()
return_code = app.exec()
RenderQueue.prepare_for_shutdown()
return return_code
class BufferingHandler(logging.Handler, QObject):
new_record = pyqtSignal(str)
def __init__(self, capacity=100):
logging.Handler.__init__(self)
QObject.__init__(self)
self.buffer = deque(maxlen=capacity) # Define a buffer with a fixed capacity
def emit(self, record):
msg = self.format(record)
self.buffer.append(msg) # Add message to the buffer
self.new_record.emit(msg) # Emit signal
def get_buffer(self):
return list(self.buffer) # Return a copy of the buffer
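The fixed-capacity behavior comes straight from `deque(maxlen=...)`: once the buffer is full, each append silently evicts the oldest entry. For example:

```python
from collections import deque

buffer = deque(maxlen=3)  # same idea as BufferingHandler's default capacity of 100
for msg in ['a', 'b', 'c', 'd']:
    buffer.append(msg)  # appending 'd' evicts 'a'
print(list(buffer))  # ['b', 'c', 'd']
```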

View File

@@ -1,82 +1,41 @@
import logging
from collections import Counter
import os
from datetime import datetime
from pathlib import Path
from typing import List, Dict, Any, Optional
from pubsub import pub
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy.orm.exc import DetachedInstanceError
from src.engines.core.base_worker import Base, BaseRenderWorker
from src.utilities.status_utils import RenderStatus
from src.engines.engine_manager import EngineManager
logger = logging.getLogger()
class JobNotFoundError(Exception):
def __init__(self, job_id: str, *args: Any) -> None:
super().__init__(args)
self.job_id = job_id
def __str__(self) -> str:
return f"Cannot find job with ID: {self.job_id}"
class RenderQueue:
engine: Optional[Any] = None
session: Optional[Any] = None
job_queue: List[BaseRenderWorker] = []
maximum_renderer_instances: Dict[str, int] = {'blender': 1, 'aerender': 1, 'ffmpeg': 4}
last_saved_counts: Dict[str, int] = {}
is_running: bool = False
# --------------------------------------------
# Render Queue Evaluation:
# --------------------------------------------
def __init__(self):
pass
@classmethod
def start(cls):
"""Start evaluating the render queue"""
logger.debug("Starting render queue updates")
cls.is_running = True
cls.evaluate_queue()
@classmethod
def evaluate_queue(cls):
try:
not_started = cls.jobs_with_status(RenderStatus.NOT_STARTED, priority_sorted=True)
for job in not_started:
if cls.is_available_for_job(job.engine_name, job.priority):
cls.start_job(job)
scheduled = cls.jobs_with_status(RenderStatus.SCHEDULED, priority_sorted=True)
for job in scheduled:
if job.scheduled_start <= datetime.now():
logger.debug(f"Starting scheduled job: {job}")
cls.start_job(job)
if cls.last_saved_counts != cls.job_counts():
cls.save_state()
except DetachedInstanceError:
pass
@classmethod
def __local_job_status_changed(cls, job_id, old_status, new_status):
render_job = RenderQueue.job_with_id(job_id, none_ok=True)
if render_job and cls.is_running: # ignore changes from render jobs not in the queue yet
logger.debug(f"RenderQueue detected job {job_id} has changed from {old_status} -> {new_status}")
RenderQueue.evaluate_queue()
@classmethod
def stop(cls):
logger.debug("Stopping render queue updates")
cls.is_running = False
# --------------------------------------------
# Fetch Jobs:
# --------------------------------------------
@classmethod
def all_jobs(cls):
@@ -107,24 +66,21 @@ class RenderQueue:
return found_job
@classmethod
def job_counts(cls):
job_counts = {}
for job_status in RenderStatus:
job_counts[job_status.value] = len(cls.jobs_with_status(job_status))
return job_counts
# --------------------------------------------
# Startup / Shutdown:
# --------------------------------------------
@classmethod
def clear_history(cls):
to_remove = [x for x in cls.all_jobs() if x.status in [RenderStatus.CANCELLED,
RenderStatus.COMPLETED, RenderStatus.ERROR]]
for job_to_remove in to_remove:
cls.delete_job(job_to_remove)
cls.save_state()
@classmethod
def load_state(cls, database_directory: Path):
if not cls.engine:
cls.engine = create_engine(f"sqlite:///{database_directory / 'database.db'}")
Base.metadata.create_all(cls.engine)
cls.session = sessionmaker(bind=cls.engine)()
cls.job_queue = cls.session.query(BaseRenderWorker).all()
pub.subscribe(cls.__local_job_status_changed, 'status_change')
@classmethod
def save_state(cls):
@@ -133,74 +89,69 @@ class RenderQueue:
@classmethod
def prepare_for_shutdown(cls):
logger.debug("Closing session")
cls.stop()
running_jobs = cls.jobs_with_status(RenderStatus.RUNNING) # cancel all running jobs
_ = [cls.cancel_job(job) for job in running_jobs]
cls.save_state()
cls.session.close()
# --------------------------------------------
# Renderer Availability:
# --------------------------------------------
@classmethod
def renderer_instances(cls):
all_instances = [x.engine_name for x in cls.running_jobs()]
return Counter(all_instances)
@classmethod
def is_available_for_job(cls, renderer, priority=2):
if not EngineManager.all_versions_for_engine(renderer):
return False
instances = cls.renderer_instances()
higher_priority_jobs = [x for x in cls.running_jobs() if x.priority < priority]
max_allowed_instances = cls.maximum_renderer_instances.get(renderer, 1)
maxed_out_instances = renderer in instances.keys() and instances[renderer] >= max_allowed_instances
return not maxed_out_instances and not higher_priority_jobs
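A minimal sketch of the per-renderer instance-cap check above, using hypothetical running jobs rather than the live queue:

```python
from collections import Counter

maximum_renderer_instances = {'blender': 1, 'aerender': 1, 'ffmpeg': 4}
instances = Counter(['ffmpeg', 'ffmpeg', 'blender'])  # hypothetical running jobs

def maxed_out(renderer: str) -> bool:
    # True once the renderer has reached its allowed instance count.
    return instances[renderer] >= maximum_renderer_instances.get(renderer, 1)

print(maxed_out('blender'), maxed_out('ffmpeg'))  # True False
```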
# --------------------------------------------
# Job Lifecycle Management:
# --------------------------------------------
@classmethod
def add_to_render_queue(cls, render_job, force_start=False):
logger.info(f"Adding job to render queue: {render_job}")
cls.job_queue.append(render_job)
if cls.is_running and force_start and render_job.status in (RenderStatus.NOT_STARTED, RenderStatus.SCHEDULED):
cls.start_job(render_job)
cls.session.add(render_job)
cls.save_state()
if cls.is_running:
cls.evaluate_queue()
@classmethod
def start_job(cls, job):
logger.info(f'Starting job: {job}')
job.start()
cls.save_state()
@classmethod
def cancel_job(cls, job):
logger.info(f'Cancelling job: {job}')
job.stop()
return job.status == RenderStatus.CANCELLED
@classmethod
def delete_job(cls, job):
logger.info(f"Deleting job: {job}")
job.stop()
cls.job_queue.remove(job)
cls.session.delete(job)
cls.save_state()
return True
# --------------------------------------------
# Miscellaneous:
# --------------------------------------------
@classmethod
def renderer_instances(cls):
from collections import Counter
all_instances = [x.renderer for x in cls.running_jobs()]
return Counter(all_instances)
@classmethod
def clear_history(cls):
to_remove = [x for x in cls.all_jobs() if x.status in [RenderStatus.CANCELLED,
RenderStatus.COMPLETED, RenderStatus.ERROR]]
for job_to_remove in to_remove:
cls.delete_job(job_to_remove)
cls.save_state()
@classmethod
def job_counts(cls):
job_counts = {}
for job_status in RenderStatus:
job_counts[job_status.value] = len(cls.jobs_with_status(job_status))
return job_counts
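`job_counts` iterates over every `RenderStatus` so that statuses with zero jobs still appear in the saved state, and `renderer_instances` tallies with `collections.Counter`. A minimal sketch of the same tallying pattern, using a toy three-value enum rather than the project's real `RenderStatus`:

```python
from collections import Counter
from enum import Enum

class RenderStatus(Enum):  # toy stand-in for the project's real enum
    NOT_STARTED = "not_started"
    RUNNING = "running"
    COMPLETED = "completed"

# statuses of the currently tracked jobs (made-up sample data)
jobs = [RenderStatus.RUNNING, RenderStatus.RUNNING, RenderStatus.COMPLETED]

tally = Counter(jobs)
# iterate the whole enum so zero-count statuses are still reported
counts = {status.value: tally.get(status, 0) for status in RenderStatus}
```

Iterating the enum (rather than just the tally) is what guarantees `last_saved_counts` comparisons see a stable set of keys.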

View File

@@ -1,74 +0,0 @@
import os
import sys
from PyQt6.QtCore import Qt
from PyQt6.QtGui import QPixmap
from PyQt6.QtWidgets import QDialog, QVBoxLayout, QLabel, QDialogButtonBox, QHBoxLayout
from src.version import *
class AboutDialog(QDialog):
def __init__(self):
super().__init__()
self.setWindowTitle(f"About {APP_NAME}")
# Create the layout
layout = QVBoxLayout()
# App Icon
icon_name = 'Server.png' # todo: temp icon - replace with final later
icon_path = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))),
'resources', icon_name)
icon_label = QLabel(self)
icon_pixmap = QPixmap(icon_path)
icon_label.setPixmap(icon_pixmap)
icon_layout = QHBoxLayout()
icon_layout.addStretch()
icon_layout.addWidget(icon_label)
icon_layout.addStretch()
layout.addLayout(icon_layout)
# Application name
name_label = QLabel(f"<h2>{APP_NAME}</h2>")
layout.addWidget(name_label)
# Description
description_label = QLabel(APP_DESCRIPTION)
layout.addWidget(description_label)
# Version
version_label = QLabel(f"<strong>Version:</strong> {APP_VERSION}")
layout.addWidget(version_label)
# Contributors
contributors_label = QLabel(f"Copyright © {APP_COPYRIGHT_YEAR} {APP_AUTHOR}")
layout.addWidget(contributors_label)
# License
license_label = QLabel(f"Released under {APP_LICENSE}")
layout.addWidget(license_label)
# Add an "OK" button to close the dialog
button_box = QDialogButtonBox(QDialogButtonBox.StandardButton.Ok)
button_box.accepted.connect(self.accept)
layout.addWidget(button_box)
# Set the layout for the dialog
self.setLayout(layout)
# Make the dialog non-resizable
self.setWindowFlags(self.windowFlags() & ~Qt.WindowType.WindowContextHelpButtonHint)
self.setFixedSize(self.sizeHint())
if __name__ == '__main__':
# lazy load GUI frameworks
from PyQt6.QtWidgets import QApplication
# load application
app: QApplication = QApplication(sys.argv)
window: AboutDialog = AboutDialog()
window.show()
app.exec()

535
src/ui/add_job.py Normal file
View File

@@ -0,0 +1,535 @@
import copy
import os.path
import pathlib
import socket
import threading
import psutil
from PyQt6.QtCore import QThread, pyqtSignal, Qt, pyqtSlot
from PyQt6.QtWidgets import (
QApplication, QWidget, QVBoxLayout, QHBoxLayout, QLabel, QLineEdit, QPushButton, QFileDialog, QSpinBox, QComboBox,
QGroupBox, QCheckBox, QProgressBar, QPlainTextEdit, QDoubleSpinBox, QMessageBox, QListWidget, QListWidgetItem
)
from requests import Response
from src.api.server_proxy import RenderServerProxy
from src.engines.engine_manager import EngineManager
from src.ui.engine_help_viewer import EngineHelpViewer
from src.utilities.zeroconf_server import ZeroconfServer
class NewRenderJobForm(QWidget):
def __init__(self, project_path=None):
super().__init__()
self.project_path = project_path
# UI
self.current_engine_options = None
self.file_format_combo = None
self.renderer_options_layout = None
self.cameras_list = None
self.cameras_group = None
self.renderer_version_combo = None
self.worker_thread = None
self.msg_box = None
self.engine_help_viewer = None
self.raw_args = None
self.submit_progress_label = None
self.submit_progress = None
self.renderer_type = None
self.process_label = None
self.process_progress_bar = None
self.splitjobs_same_os = None
self.enable_splitjobs = None
self.server_input = None
self.submit_button = None
self.notes_input = None
self.priority_input = None
self.end_frame_input = None
self.start_frame_input = None
self.output_path_input = None
self.scene_file_input = None
self.scene_file_browse_button = None
self.job_name_input = None
# Job / Server Data
self.server_proxy = RenderServerProxy(socket.gethostname())
self.renderer_info = None
self.project_info = None
# Setup
self.setWindowTitle("New Job")
self.setup_ui()
self.setup_project()
# get renderer info in bg thread
t = threading.Thread(target=self.update_renderer_info)
t.start()
self.show()
def setup_ui(self):
# Main Layout
main_layout = QVBoxLayout(self)
# Scene File Group
scene_file_group = QGroupBox("Project")
scene_file_layout = QVBoxLayout(scene_file_group)
scene_file_picker_layout = QHBoxLayout()
self.scene_file_input = QLineEdit()
self.scene_file_input.setText(self.project_path)
self.scene_file_browse_button = QPushButton("Browse...")
self.scene_file_browse_button.clicked.connect(self.browse_scene_file)
scene_file_picker_layout.addWidget(self.scene_file_input)
scene_file_picker_layout.addWidget(self.scene_file_browse_button)
scene_file_layout.addLayout(scene_file_picker_layout)
# progress bar
progress_layout = QHBoxLayout()
self.process_progress_bar = QProgressBar()
self.process_progress_bar.setMinimum(0)
self.process_progress_bar.setMaximum(0)
self.process_progress_bar.setHidden(True)
self.process_label = QLabel("Processing")
self.process_label.setHidden(True)
progress_layout.addWidget(self.process_label)
progress_layout.addWidget(self.process_progress_bar)
scene_file_layout.addLayout(progress_layout)
main_layout.addWidget(scene_file_group)
# Server Group
# Server List
self.server_group = QGroupBox("Server")
server_layout = QVBoxLayout(self.server_group)
server_list_layout = QHBoxLayout()
server_list_layout.setSpacing(0)
self.server_input = QComboBox()
server_list_layout.addWidget(QLabel("Hostname:"), 1)
server_list_layout.addWidget(self.server_input, 3)
server_layout.addLayout(server_list_layout)
main_layout.addWidget(self.server_group)
self.update_server_list()
# Priority
priority_layout = QHBoxLayout()
priority_layout.addWidget(QLabel("Priority:"), 1)
self.priority_input = QComboBox()
self.priority_input.addItems(["High", "Medium", "Low"])
self.priority_input.setCurrentIndex(1)
priority_layout.addWidget(self.priority_input, 3)
server_layout.addLayout(priority_layout)
# Splitjobs
self.enable_splitjobs = QCheckBox("Automatically split render across multiple servers")
self.enable_splitjobs.setEnabled(True)
server_layout.addWidget(self.enable_splitjobs)
self.splitjobs_same_os = QCheckBox("Only render on same OS")
self.splitjobs_same_os.setEnabled(True)
server_layout.addWidget(self.splitjobs_same_os)
# Output Settings Group
self.output_settings_group = QGroupBox("Output Settings")
output_settings_layout = QVBoxLayout(self.output_settings_group)
# output path
output_path_layout = QHBoxLayout()
output_path_layout.addWidget(QLabel("Render name:"))
self.output_path_input = QLineEdit()
output_path_layout.addWidget(self.output_path_input)
output_settings_layout.addLayout(output_path_layout)
# file format
file_format_layout = QHBoxLayout()
file_format_layout.addWidget(QLabel("Format:"))
self.file_format_combo = QComboBox()
file_format_layout.addWidget(self.file_format_combo)
output_settings_layout.addLayout(file_format_layout)
# frame range
frame_range_layout = QHBoxLayout()  # no parent; added via addLayout() below
self.start_frame_input = QSpinBox()
self.start_frame_input.setRange(1, 99999)
self.end_frame_input = QSpinBox()
self.end_frame_input.setRange(1, 99999)
frame_range_layout.addWidget(QLabel("Frames:"))
frame_range_layout.addWidget(self.start_frame_input)
frame_range_layout.addWidget(QLabel("to"))
frame_range_layout.addWidget(self.end_frame_input)
output_settings_layout.addLayout(frame_range_layout)
# resolution
resolution_layout = QHBoxLayout()  # no parent; added via addLayout() below
self.resolution_x_input = QSpinBox()
self.resolution_x_input.setRange(1, 9999) # Assuming max resolution width 9999
self.resolution_x_input.setValue(1920)
self.resolution_y_input = QSpinBox()
self.resolution_y_input.setRange(1, 9999) # Assuming max resolution height 9999
self.resolution_y_input.setValue(1080)
self.frame_rate_input = QDoubleSpinBox()
self.frame_rate_input.setRange(1, 9999)  # Assumed upper bound for frame rate
self.frame_rate_input.setDecimals(3)
self.frame_rate_input.setValue(23.976)
resolution_layout.addWidget(QLabel("Resolution:"))
resolution_layout.addWidget(self.resolution_x_input)
resolution_layout.addWidget(QLabel("x"))
resolution_layout.addWidget(self.resolution_y_input)
resolution_layout.addWidget(QLabel("@"))
resolution_layout.addWidget(self.frame_rate_input)
resolution_layout.addWidget(QLabel("fps"))
output_settings_layout.addLayout(resolution_layout)
# add group to layout
main_layout.addWidget(self.output_settings_group)
# Renderer Group
self.renderer_group = QGroupBox("Renderer Settings")
renderer_group_layout = QVBoxLayout(self.renderer_group)
renderer_layout = QHBoxLayout()
renderer_layout.addWidget(QLabel("Renderer:"))
self.renderer_type = QComboBox()
self.renderer_type.currentIndexChanged.connect(self.renderer_changed)
renderer_layout.addWidget(self.renderer_type)
# Version
renderer_layout.addWidget(QLabel("Version:"))
self.renderer_version_combo = QComboBox()
renderer_layout.addWidget(self.renderer_version_combo)
renderer_group_layout.addLayout(renderer_layout)
# dynamic options
self.renderer_options_layout = QVBoxLayout()
renderer_group_layout.addLayout(self.renderer_options_layout)
# Raw Args
raw_args_layout = QHBoxLayout()  # no parent; added via addLayout() below
raw_args_layout.addWidget(QLabel("Raw Args:"))
self.raw_args = QLineEdit()
raw_args_layout.addWidget(self.raw_args)
args_help_button = QPushButton("?")
args_help_button.clicked.connect(self.args_help_button_clicked)
raw_args_layout.addWidget(args_help_button)
renderer_group_layout.addLayout(raw_args_layout)
main_layout.addWidget(self.renderer_group)
# Cameras Group
self.cameras_group = QGroupBox("Cameras")
cameras_layout = QVBoxLayout(self.cameras_group)
self.cameras_list = QListWidget()
self.cameras_group.setHidden(True)
cameras_layout.addWidget(self.cameras_list)
main_layout.addWidget(self.cameras_group)
# Notes Group
self.notes_group = QGroupBox("Additional Notes")
notes_layout = QVBoxLayout(self.notes_group)
self.notes_input = QPlainTextEdit()
notes_layout.addWidget(self.notes_input)
main_layout.addWidget(self.notes_group)
# Submit Button
self.submit_button = QPushButton("Submit Job")
self.submit_button.clicked.connect(self.submit_job)
main_layout.addWidget(self.submit_button)
self.submit_progress = QProgressBar()
self.submit_progress.setMinimum(0)
self.submit_progress.setMaximum(0)
self.submit_progress.setHidden(True)
main_layout.addWidget(self.submit_progress)
self.submit_progress_label = QLabel("Submitting...")
self.submit_progress_label.setHidden(True)
main_layout.addWidget(self.submit_progress_label)
self.toggle_renderer_enablement(False)
def update_renderer_info(self):
# get the renderer info and add them all to the ui
self.renderer_info = self.server_proxy.get_renderer_info()
self.renderer_type.addItems(self.renderer_info.keys())
# select the best renderer for the file type
engine = EngineManager.engine_for_project_path(self.project_path)
self.renderer_type.setCurrentText(engine.name().lower())
# refresh ui
self.renderer_changed()
def renderer_changed(self):
# load the version numbers
current_renderer = self.renderer_type.currentText().lower() or self.renderer_type.itemText(0)
self.renderer_version_combo.clear()
self.file_format_combo.clear()
if current_renderer:
renderer_vers = [version_info['version'] for version_info in self.renderer_info[current_renderer]['versions']]
self.renderer_version_combo.addItems(renderer_vers)
self.file_format_combo.addItems(self.renderer_info[current_renderer]['supported_export_formats'])
def update_server_list(self):
clients = ZeroconfServer.found_hostnames()
self.server_input.clear()
self.server_input.addItems(clients)
def browse_scene_file(self):
file_name, _ = QFileDialog.getOpenFileName(self, "Select Scene File")
if file_name:
self.scene_file_input.setText(file_name)
self.setup_project()
def setup_project(self):
# UI stuff on main thread
self.process_progress_bar.setHidden(False)
self.process_label.setHidden(False)
self.toggle_renderer_enablement(False)
output_name, _ = os.path.splitext(os.path.basename(self.scene_file_input.text()))
output_name = output_name.replace(' ', '_')
self.output_path_input.setText(output_name)
file_name = self.scene_file_input.text()
# setup bg worker
self.worker_thread = GetProjectInfoWorker(window=self, project_path=file_name)
self.worker_thread.message_signal.connect(self.post_get_project_info_update)
self.worker_thread.start()
def browse_output_path(self):
directory = QFileDialog.getExistingDirectory(self, "Select Output Directory")
if directory:
self.output_path_input.setText(directory)
def args_help_button_clicked(self):
url = (f'http://{self.server_proxy.hostname}:{self.server_proxy.port}/api/renderer/'
f'{self.renderer_type.currentText()}/help')
self.engine_help_viewer = EngineHelpViewer(url)
self.engine_help_viewer.show()
# -------- Update --------
def post_get_project_info_update(self):
"""Called by the GetProjectInfoWorker - Do not call directly."""
try:
# Set the best renderer we can find
input_path = self.scene_file_input.text()
engine = EngineManager.engine_for_project_path(input_path)
engine_index = self.renderer_type.findText(engine.name().lower())
if engine_index >= 0:
self.renderer_type.setCurrentIndex(engine_index)
else:
self.renderer_type.setCurrentIndex(0)  # TODO: find out why renderer info is sometimes missing here
# not ideal, but without renderer info we have to pick something
self.output_path_input.setText(os.path.basename(input_path))
# cleanup progress UI
self.process_progress_bar.setHidden(True)
self.process_label.setHidden(True)
self.toggle_renderer_enablement(True)
# Load scene data
self.start_frame_input.setValue(self.project_info.get('frame_start', 1))
self.end_frame_input.setValue(self.project_info.get('frame_end', 1))
self.resolution_x_input.setValue(self.project_info.get('resolution_x', 1920))
self.resolution_y_input.setValue(self.project_info.get('resolution_y', 1080))
self.frame_rate_input.setValue(self.project_info.get('fps', 23.976))
# Cameras
self.cameras_list.clear()
if self.project_info.get('cameras'):
self.cameras_group.setHidden(False)
found_active = False
for camera in self.project_info['cameras']:
# create the list items and make them checkable
item = QListWidgetItem(f"{camera['name']} - {camera['lens']}mm")
item.setFlags(item.flags() | Qt.ItemFlag.ItemIsUserCheckable)
is_checked = camera['is_active'] or len(self.project_info['cameras']) == 1
found_active = found_active or is_checked
item.setCheckState(Qt.CheckState.Checked if is_checked else Qt.CheckState.Unchecked)
self.cameras_list.addItem(item)
if not found_active:
self.cameras_list.item(0).setCheckState(Qt.CheckState.Checked)
else:
self.cameras_group.setHidden(True)
# Dynamic Engine Options
clear_layout(self.renderer_options_layout) # clear old options
# dynamically populate option list
self.current_engine_options = engine().get_options()
for option in self.current_engine_options:
h_layout = QHBoxLayout()
label = QLabel(option['name'].capitalize() + ':')
h_layout.addWidget(label)
if option.get('options'):
combo_box = QComboBox()
for opt in option['options']:
combo_box.addItem(opt)
h_layout.addWidget(combo_box)
else:
text_box = QLineEdit()
h_layout.addWidget(text_box)
self.renderer_options_layout.addLayout(h_layout)
except AttributeError:
# project info can be incomplete for unsupported scene files; leave the defaults in place
pass
def toggle_renderer_enablement(self, enabled=False):
"""Toggle on/off all the render settings"""
self.server_group.setHidden(not enabled)
self.output_settings_group.setHidden(not enabled)
self.renderer_group.setHidden(not enabled)
self.notes_group.setHidden(not enabled)
if not enabled:
self.cameras_group.setHidden(True)
self.submit_button.setEnabled(enabled)
def after_job_submission(self, result):
# UI cleanup
self.submit_progress.setMaximum(0)
self.submit_button.setHidden(False)
self.submit_progress.setHidden(True)
self.submit_progress_label.setHidden(True)
self.process_progress_bar.setHidden(True)
self.process_label.setHidden(True)
self.toggle_renderer_enablement(True)
self.msg_box = QMessageBox()
if result.ok:
self.msg_box.setStandardButtons(QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No)
self.msg_box.setIcon(QMessageBox.Icon.Information)
self.msg_box.setText("Job successfully submitted to server. Submit another?")
self.msg_box.setWindowTitle("Success")
x = self.msg_box.exec()
if x == QMessageBox.StandardButton.No:
self.close()
else:
self.msg_box.setStandardButtons(QMessageBox.StandardButton.Ok)
self.msg_box.setIcon(QMessageBox.Icon.Critical)
self.msg_box.setText(result.text or "Unknown error")
self.msg_box.setWindowTitle("Error")
self.msg_box.exec()
# -------- Submit Job Calls --------
def submit_job(self):
# Pre-worker UI
self.submit_progress.setHidden(False)
self.submit_progress_label.setHidden(False)
self.submit_button.setHidden(True)
self.submit_progress.setMaximum(0)
# submit job in background thread
self.worker_thread = SubmitWorker(window=self)
self.worker_thread.update_ui_signal.connect(self.update_submit_progress)
self.worker_thread.message_signal.connect(self.after_job_submission)
self.worker_thread.start()
@pyqtSlot(str, str)
def update_submit_progress(self, hostname, percent):
# Update the UI here. This slot will be executed in the main thread
self.submit_progress_label.setText(f"Transferring to {hostname} - {percent}%")
self.submit_progress.setMaximum(100)
self.submit_progress.setValue(int(percent))
class SubmitWorker(QThread):
"""Worker class called to submit all the jobs to the server and update the UI accordingly"""
message_signal = pyqtSignal(Response)
update_ui_signal = pyqtSignal(str, str)
def __init__(self, window):
super().__init__()
self.window = window
def run(self):
def create_callback(encoder):
encoder_len = encoder.len
def callback(monitor):
percent = f"{monitor.bytes_read / encoder_len * 100:.0f}"
self.update_ui_signal.emit(hostname, percent)
return callback
hostname = self.window.server_input.currentText()
job_json = {'owner': psutil.Process().username() + '@' + socket.gethostname(),
'renderer': self.window.renderer_type.currentText().lower(),
'renderer_version': self.window.renderer_version_combo.currentText(),
'args': {'raw': self.window.raw_args.text()},
'output_path': self.window.output_path_input.text(),
'start_frame': self.window.start_frame_input.value(),
'end_frame': self.window.end_frame_input.value(),
'priority': self.window.priority_input.currentIndex() + 1,
'notes': self.window.notes_input.toPlainText(),
'enable_split_jobs': self.window.enable_splitjobs.isChecked(),
'split_jobs_same_os': self.window.splitjobs_same_os.isChecked()}
# get the dynamic args
for i in range(self.window.renderer_options_layout.count()):
item = self.window.renderer_options_layout.itemAt(i)
layout = item.layout() # get the layout
for x in range(layout.count()):
z = layout.itemAt(x)
widget = z.widget()
if isinstance(widget, QComboBox):
job_json['args'][self.window.current_engine_options[i]['name']] = widget.currentText()
elif isinstance(widget, QLineEdit):
job_json['args'][self.window.current_engine_options[i]['name']] = widget.text()
# determine if any cameras are checked
selected_cameras = []
if self.window.cameras_list.count() and not self.window.cameras_group.isHidden():
for index in range(self.window.cameras_list.count()):
item = self.window.cameras_list.item(index)
if item.checkState() == Qt.CheckState.Checked:
selected_cameras.append(item.text().rsplit('-', 1)[0].strip()) # cleanup to just camera name
# process cameras into nested format
input_path = self.window.scene_file_input.text()
if selected_cameras:
job_list = []
for cam in selected_cameras:
job_copy = copy.deepcopy(job_json)
job_copy['args']['camera'] = cam
job_copy['name'] = pathlib.Path(input_path).stem.replace(' ', '_') + "-" + cam.replace(' ', '')
job_list.append(job_copy)
else:
job_list = [job_json]
# presubmission tasks
engine = EngineManager.engine_with_name(self.window.renderer_type.currentText().lower())
input_path = engine().perform_presubmission_tasks(input_path)
# submit
result = self.window.server_proxy.post_job_to_server(file_path=input_path, job_list=job_list,
callback=create_callback)
self.message_signal.emit(result)
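`create_callback` above captures `encoder.len` in a closure so the per-chunk monitor only needs the bytes read so far. Stripped of the Qt signal and `requests_toolbelt` machinery, the pattern reduces to the following sketch (byte counts are made up for illustration):

```python
def create_callback(total_bytes):
    """Build a progress callback that closes over the total upload size."""
    def callback(bytes_read):
        # integer percentage as a string, matching what the UI slot expects
        return f"{bytes_read / total_bytes * 100:.0f}"
    return callback

cb = create_callback(400)
cb(100)  # -> "25"
```

Capturing the total once avoids re-reading `encoder.len` on every chunk, and keeps the callback signature down to the single argument the monitor provides.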
class GetProjectInfoWorker(QThread):
"""Worker class called to retrieve information about a project file on a background thread and update the UI"""
message_signal = pyqtSignal()
def __init__(self, window, project_path):
super().__init__()
self.window = window
self.project_path = project_path
def run(self):
engine = EngineManager.engine_for_project_path(self.project_path)
self.window.project_info = engine().get_project_info(self.project_path)
self.message_signal.emit()
def clear_layout(layout):
if layout is not None:
# Go through the layout's items in reverse order
for i in reversed(range(layout.count())):
# Take the item at the current position
item = layout.takeAt(i)
# Check if the item is a widget
if item.widget():
# Remove the widget and delete it
widget_to_remove = item.widget()
widget_to_remove.setParent(None)
widget_to_remove.deleteLater()
elif item.layout():
# If the item is a sub-layout, clear its contents recursively
clear_layout(item.layout())
# Then delete the layout
item.layout().deleteLater()
# Run the application
if __name__ == '__main__':
app = QApplication([])
window = NewRenderJobForm()
app.exec()
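`clear_layout` walks the layout in reverse because `takeAt` shifts the indices of the items that follow it. The same rule applies to any index-based removal; this Qt-free sketch shows why the reversed traversal is safe:

```python
def remove_evens(items):
    # iterate in reverse so pop() never shifts an index we still need to visit
    for i in reversed(range(len(items))):
        if items[i] % 2 == 0:
            items.pop(i)
    return items

remove_evens([1, 2, 3, 4, 5])  # -> [1, 3, 5]
```

A forward loop with `pop(i)` would skip the element that slides into the freed slot, which is exactly the bug the reversed range avoids.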

View File

@@ -1,669 +0,0 @@
import socket
from pathlib import Path
import psutil
from PyQt6.QtCore import QThread, pyqtSignal, Qt, pyqtSlot
from PyQt6.QtWidgets import (
QApplication, QWidget, QVBoxLayout, QHBoxLayout, QLabel, QLineEdit, QPushButton, QFileDialog, QSpinBox, QComboBox,
QGroupBox, QCheckBox, QProgressBar, QPlainTextEdit, QDoubleSpinBox, QMessageBox, QListWidget, QListWidgetItem,
QTabWidget
)
from src.api.server_proxy import RenderServerProxy
from src.engines.engine_manager import EngineManager
from src.ui.engine_help_window import EngineHelpViewer
from src.utilities.misc_helper import COMMON_RESOLUTIONS, COMMON_FRAME_RATES
from src.utilities.zeroconf_server import ZeroconfServer
class NewRenderJobForm(QWidget):
def __init__(self, project_path=None):
super().__init__()
self.resolution_options_list = None
self.resolution_x_input = None
self.resolution_y_input = None
self.fps_options_list = None
self.fps_input = None
self.engine_group = None
self.notes_group = None
self.output_settings_group = None
self.project_path = project_path
# UI
self.project_group = None
self.load_file_group = None
self.current_engine_options = None
self.file_format_combo = None
self.engine_options_layout = None
self.cameras_list = None
self.cameras_group = None
self.engine_version_combo = None
self.worker_thread = None
self.msg_box = None
self.engine_help_viewer = None
self.raw_args = None
self.submit_progress_label = None
self.submit_progress = None
self.engine_type = None
self.process_label = None
self.process_progress_bar = None
self.splitjobs_same_os = None
self.enable_splitjobs = None
self.server_input = None
self.submit_button = None
self.notes_input = None
self.priority_input = None
self.end_frame_input = None
self.start_frame_input = None
self.job_name_input = None
self.scene_file_input = None
self.scene_file_browse_button = None
self.tabs = None
# Job / Server Data
self.server_proxy = RenderServerProxy(socket.gethostname())
self.project_info = None
self.installed_engines = {}
self.preferred_engine = None
# Setup
self.setWindowTitle("New Job")
self.setup_ui()
self.setup_project()
self.show()
def setup_ui(self):
# Main widget layout
main_layout = QVBoxLayout(self)
# Tabs
self.tabs = QTabWidget()
# ==================== Loading Section (outside tabs) ====================
self.load_file_group = QGroupBox("Loading")
load_file_layout = QVBoxLayout(self.load_file_group)
progress_layout = QHBoxLayout()
self.process_label = QLabel("Processing")
self.process_progress_bar = QProgressBar()
self.process_progress_bar.setMinimum(0)
self.process_progress_bar.setMaximum(0) # Indeterminate
progress_layout.addWidget(self.process_label)
progress_layout.addWidget(self.process_progress_bar)
load_file_layout.addLayout(progress_layout)
# Scene File
job_overview_group = QGroupBox("Project File")
file_group_layout = QVBoxLayout(job_overview_group)
# Job Name
job_name_layout = QHBoxLayout()
job_name_layout.addWidget(QLabel("Job name:"))
self.job_name_input = QLineEdit()
job_name_layout.addWidget(self.job_name_input)
self.engine_type = QComboBox()
job_name_layout.addWidget(self.engine_type)
file_group_layout.addLayout(job_name_layout)
# Job File
scene_file_picker_layout = QHBoxLayout()
scene_file_picker_layout.addWidget(QLabel("File:"))
self.scene_file_input = QLineEdit()
self.scene_file_input.setText(self.project_path)
self.scene_file_browse_button = QPushButton("Browse...")
self.scene_file_browse_button.clicked.connect(self.browse_scene_file)
scene_file_picker_layout.addWidget(self.scene_file_input)
scene_file_picker_layout.addWidget(self.scene_file_browse_button)
file_group_layout.addLayout(scene_file_picker_layout)
main_layout.addWidget(job_overview_group)
main_layout.addWidget(self.load_file_group)
main_layout.addWidget(self.tabs)
# ==================== Tab 1: Job Settings ====================
self.project_group = QWidget()
project_layout = QVBoxLayout(self.project_group) # Fixed: proper parent
# Server / Hostname
server_list_layout = QHBoxLayout()
server_list_layout.addWidget(QLabel("Render Target:"))
self.server_input = QComboBox()
server_list_layout.addWidget(self.server_input)
project_layout.addLayout(server_list_layout)
# Priority
priority_layout = QHBoxLayout()
priority_layout.addWidget(QLabel("Priority:"))
self.priority_input = QComboBox()
self.priority_input.addItems(["High", "Medium", "Low"])
self.priority_input.setCurrentIndex(1)
priority_layout.addWidget(self.priority_input)
project_layout.addLayout(priority_layout)
# Split Jobs Options
self.enable_splitjobs = QCheckBox("Automatically split render across multiple servers")
project_layout.addWidget(self.enable_splitjobs)
self.splitjobs_same_os = QCheckBox("Only render on same OS")
project_layout.addWidget(self.splitjobs_same_os)
project_layout.addStretch() # Push everything up
# ==================== Tab 2: Output Settings ====================
self.output_settings_group = QWidget()
output_settings_layout = QVBoxLayout(self.output_settings_group)
# File Format
format_group = QGroupBox("Format / Range")
output_settings_layout.addWidget(format_group)
format_group_layout = QVBoxLayout()
format_group.setLayout(format_group_layout)
file_format_layout = QHBoxLayout()
file_format_layout.addWidget(QLabel("Format:"))
self.file_format_combo = QComboBox()
self.file_format_combo.setFixedWidth(200)
file_format_layout.addWidget(self.file_format_combo)
file_format_layout.addStretch()
format_group_layout.addLayout(file_format_layout)
# Frame Range
frame_range_layout = QHBoxLayout()
frame_range_layout.addWidget(QLabel("Frames:"))
self.start_frame_input = QSpinBox()
self.start_frame_input.setRange(1, 99999)
self.start_frame_input.setFixedWidth(80)
self.end_frame_input = QSpinBox()
self.end_frame_input.setRange(1, 99999)
self.end_frame_input.setFixedWidth(80)
frame_range_layout.addWidget(self.start_frame_input)
frame_range_layout.addWidget(QLabel("to"))
frame_range_layout.addWidget(self.end_frame_input)
frame_range_layout.addStretch()
format_group_layout.addLayout(frame_range_layout)
# --- Resolution & FPS Group ---
resolution_group = QGroupBox("Resolution / Frame Rate")
output_settings_layout.addWidget(resolution_group)
resolution_group_layout = QVBoxLayout()
resolution_group.setLayout(resolution_group_layout)
# Resolution
resolution_layout = QHBoxLayout()
self.resolution_options_list = QComboBox()
self.resolution_options_list.setFixedWidth(200)
self.resolution_options_list.addItem("Original Size")
for res in COMMON_RESOLUTIONS:
self.resolution_options_list.addItem(res)
self.resolution_options_list.currentIndexChanged.connect(self._resolution_preset_changed)
resolution_layout.addWidget(self.resolution_options_list)
resolution_group_layout.addLayout(resolution_layout)
self.resolution_x_input = QSpinBox()
self.resolution_x_input.setRange(1, 9999)
self.resolution_x_input.setValue(1920)
self.resolution_x_input.setFixedWidth(80)
resolution_layout.addWidget(self.resolution_x_input)
self.resolution_y_input = QSpinBox()
self.resolution_y_input.setRange(1, 9999)
self.resolution_y_input.setValue(1080)
self.resolution_y_input.setFixedWidth(80)
resolution_layout.addWidget(QLabel("x"))
resolution_layout.addWidget(self.resolution_y_input)
resolution_layout.addStretch()
fps_layout = QHBoxLayout()
self.fps_options_list = QComboBox()
self.fps_options_list.setFixedWidth(200)
self.fps_options_list.addItem("Original FPS")
for fps_option in COMMON_FRAME_RATES:
self.fps_options_list.addItem(fps_option)
self.fps_options_list.currentIndexChanged.connect(self._fps_preset_changed)
fps_layout.addWidget(self.fps_options_list)
self.fps_input = QDoubleSpinBox()
self.fps_input.setDecimals(3)
self.fps_input.setRange(1.0, 999.0)
self.fps_input.setValue(23.976)
self.fps_input.setFixedWidth(80)
fps_layout.addWidget(self.fps_input)
fps_layout.addWidget(QLabel("fps"))
fps_layout.addStretch()
resolution_group_layout.addLayout(fps_layout)
output_settings_layout.addStretch()
# ==================== Tab 3: Engine Settings ====================
self.engine_group = QWidget()
engine_group_layout = QVBoxLayout(self.engine_group)
engine_layout = QHBoxLayout()
engine_layout.addWidget(QLabel("Engine Version:"))
self.engine_version_combo = QComboBox()
self.engine_version_combo.addItem('latest')
engine_layout.addWidget(self.engine_version_combo)
engine_group_layout.addLayout(engine_layout)
# Dynamic engine options
self.engine_options_layout = QVBoxLayout()
engine_group_layout.addLayout(self.engine_options_layout)
# Raw Args
raw_args_layout = QHBoxLayout()
raw_args_layout.addWidget(QLabel("Raw Args:"))
self.raw_args = QLineEdit()
raw_args_layout.addWidget(self.raw_args)
args_help_button = QPushButton("?")
args_help_button.clicked.connect(self.args_help_button_clicked)
raw_args_layout.addWidget(args_help_button)
engine_group_layout.addLayout(raw_args_layout)
engine_group_layout.addStretch()
# ==================== Tab 4: Cameras ====================
self.cameras_group = QWidget()
cameras_layout = QVBoxLayout(self.cameras_group)
self.cameras_list = QListWidget()
self.cameras_list.itemChanged.connect(self.update_job_count)
cameras_layout.addWidget(self.cameras_list)
# ==================== Tab 5: Misc / Notes ====================
self.notes_group = QWidget()
notes_layout = QVBoxLayout(self.notes_group)
self.notes_input = QPlainTextEdit()
notes_layout.addWidget(self.notes_input)
# == Create Tabs
self.tabs.addTab(self.project_group, "Job Settings")
self.tabs.addTab(self.output_settings_group, "Output Settings")
self.tabs.addTab(self.engine_group, "Engine Settings")
self.tabs.addTab(self.cameras_group, "Cameras")
self.tabs.addTab(self.notes_group, "Notes")
self.update_server_list()
index = self.tabs.indexOf(self.cameras_group)
if index != -1:
self.tabs.setTabEnabled(index, False)
# ==================== Submit Section (outside tabs) ====================
self.submit_button = QPushButton("Submit Job")
self.submit_button.clicked.connect(self.submit_job)
main_layout.addWidget(self.submit_button)
self.submit_progress = QProgressBar()
self.submit_progress.setMinimum(0)
self.submit_progress.setMaximum(0)
self.submit_progress.setHidden(True)
main_layout.addWidget(self.submit_progress)
self.submit_progress_label = QLabel("Submitting...")
self.submit_progress_label.setHidden(True)
main_layout.addWidget(self.submit_progress_label)
# Initial engine state
self.toggle_engine_enablement(False)
self.tabs.setCurrentIndex(0)
def update_job_count(self, changed_item=None):
checked = 1
if self.cameras_group.isEnabled():
checked = 0
total = self.cameras_list.count()
for i in range(total):
item = self.cameras_list.item(i)
if item.checkState() == Qt.CheckState.Checked:
checked += 1
message = f"Submit {checked} Jobs" if checked > 1 else "Submit Job"
self.submit_button.setText(message)
self.submit_button.setEnabled(bool(checked))
def _resolution_preset_changed(self, index):
selected_res = COMMON_RESOLUTIONS.get(self.resolution_options_list.currentText())
if selected_res:
self.resolution_x_input.setValue(selected_res[0])
self.resolution_y_input.setValue(selected_res[1])
elif index == 0:
self.resolution_x_input.setValue(self.project_info.get('resolution_x'))
self.resolution_y_input.setValue(self.project_info.get('resolution_y'))
def _fps_preset_changed(self, index):
selected_fps = COMMON_FRAME_RATES.get(self.fps_options_list.currentText())
if selected_fps:
self.fps_input.setValue(selected_fps)
elif index == 0:
self.fps_input.setValue(self.project_info.get('fps'))
def engine_changed(self):
# load the version numbers
current_engine = self.engine_type.currentText().lower() or self.engine_type.itemText(0).lower()
self.engine_version_combo.clear()
self.engine_version_combo.addItem('latest')
self.file_format_combo.clear()
if current_engine:
engine_info = self.server_proxy.get_engine_info(current_engine, 'full', timeout=10)
if not engine_info:
raise FileNotFoundError(f"Cannot get information about engine '{current_engine}'")
self.current_engine_options = engine_info.get('options', [])
engine_vers = [v['version'] for v in engine_info['versions']]
self.engine_version_combo.addItems(engine_vers)
self.file_format_combo.addItems(engine_info.get('supported_export_formats', []))
def update_server_list(self):
clients = ZeroconfServer.found_hostnames()
self.server_input.clear()
self.server_input.addItems(clients)
def browse_scene_file(self):
file_name, _ = QFileDialog.getOpenFileName(self, "Select Scene File")
if file_name:
self.scene_file_input.setText(file_name)
self.setup_project()
def setup_project(self):
# UI stuff on main thread
self.process_progress_bar.setHidden(False)
self.process_label.setHidden(False)
self.toggle_engine_enablement(False)
output_name = Path(self.scene_file_input.text()).stem.replace(' ', '_')
self.job_name_input.setText(output_name)
file_name = self.scene_file_input.text()
# setup bg worker
self.worker_thread = GetProjectInfoWorker(window=self, project_path=file_name)
self.worker_thread.message_signal.connect(self.post_get_project_info_update)
self.worker_thread.error_signal.connect(self.show_error_message)
self.worker_thread.start()
def browse_output_path(self):
directory = QFileDialog.getExistingDirectory(self, "Select Output Directory")
if directory:
self.job_name_input.setText(directory)
def args_help_button_clicked(self):
url = (f'http://{self.server_proxy.hostname}:{self.server_proxy.port}/api/engine/'
f'{self.engine_type.currentText()}/help')
self.engine_help_viewer = EngineHelpViewer(url)
self.engine_help_viewer.show()
def show_error_message(self, message):
msg = QMessageBox(self)
msg.setIcon(QMessageBox.Icon.Critical)
msg.setWindowTitle("Error")
msg.setText(message)
msg.exec()
# -------- Update --------
def post_get_project_info_update(self):
"""Called by the GetProjectInfoWorker - Do not call directly."""
try:
self.engine_type.addItems(self.installed_engines.keys())
self.engine_type.setCurrentText(self.preferred_engine)
self.engine_changed()
# Set the best engine we can find
input_path = self.scene_file_input.text()
engine = EngineManager.engine_class_for_project_path(input_path)
engine_index = self.engine_type.findText(engine.name().lower())
if engine_index >= 0:
self.engine_type.setCurrentIndex(engine_index)
else:
self.engine_type.setCurrentIndex(0)
# Not ideal, but without engine info we have to pick something.
# TODO: find out why we don't have engine info yet
# cleanup progress UI
self.load_file_group.setHidden(True)
self.toggle_engine_enablement(True)
# Load scene data
self.start_frame_input.setValue(self.project_info.get('frame_start'))
self.end_frame_input.setValue(self.project_info.get('frame_end'))
self.resolution_x_input.setValue(self.project_info.get('resolution_x'))
self.resolution_y_input.setValue(self.project_info.get('resolution_y'))
self.fps_input.setValue(self.project_info.get('fps'))
# Cameras
self.cameras_list.clear()
index = self.tabs.indexOf(self.cameras_group)
if self.project_info.get('cameras'):
self.tabs.setTabEnabled(index, True)
found_active = False
for camera in self.project_info['cameras']:
# create the list items and make them checkable
item = QListWidgetItem(f"{camera['name']} - {camera['lens']}mm")
item.setFlags(item.flags() | Qt.ItemFlag.ItemIsUserCheckable)
is_checked = camera['is_active'] or len(self.project_info['cameras']) == 1
found_active = found_active or is_checked
item.setCheckState(Qt.CheckState.Checked if is_checked else Qt.CheckState.Unchecked)
self.cameras_list.addItem(item)
if not found_active:
self.cameras_list.item(0).setCheckState(Qt.CheckState.Checked)
else:
self.tabs.setTabEnabled(index, False)
self.update_job_count()
# Dynamic Engine Options
clear_layout(self.engine_options_layout) # clear old options
# dynamically populate option list
for option in self.current_engine_options:
h_layout = QHBoxLayout()
label = QLabel(option['name'].replace('_', ' ').capitalize() + ':')
h_layout.addWidget(label)
if option.get('options'):
combo_box = QComboBox()
combo_box.addItems(option['options'])
h_layout.addWidget(combo_box)
else:
text_box = QLineEdit()
h_layout.addWidget(text_box)
self.engine_options_layout.addLayout(h_layout)
except AttributeError:
pass
def toggle_engine_enablement(self, enabled=False):
"""Toggle on/off all the render settings"""
indexes = [self.tabs.indexOf(self.project_group),
self.tabs.indexOf(self.output_settings_group),
self.tabs.indexOf(self.engine_group),
self.tabs.indexOf(self.cameras_group),
self.tabs.indexOf(self.notes_group)]
for idx in indexes:
self.tabs.setTabEnabled(idx, enabled)
self.submit_button.setEnabled(enabled)
def after_job_submission(self, error_string):
# UI cleanup
self.submit_progress.setMaximum(0)
self.submit_button.setHidden(False)
self.submit_progress.setHidden(True)
self.submit_progress_label.setHidden(True)
self.process_progress_bar.setHidden(True)
self.process_label.setHidden(True)
self.toggle_engine_enablement(True)
self.msg_box = QMessageBox()
if not error_string:
self.msg_box.setStandardButtons(QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No)
self.msg_box.setIcon(QMessageBox.Icon.Information)
self.msg_box.setText("Job successfully submitted to server. Submit another?")
self.msg_box.setWindowTitle("Success")
x = self.msg_box.exec()
if x == QMessageBox.StandardButton.No:
self.close()
else:
self.msg_box.setStandardButtons(QMessageBox.StandardButton.Ok)
self.msg_box.setIcon(QMessageBox.Icon.Critical)
self.msg_box.setText(error_string)
self.msg_box.setWindowTitle("Error")
self.msg_box.exec()
# -------- Submit Job Calls --------
def submit_job(self):
# Pre-worker UI
self.submit_progress.setHidden(False)
self.submit_progress_label.setHidden(False)
self.submit_button.setHidden(True)
self.submit_progress.setMaximum(0)
# submit job in background thread
self.worker_thread = SubmitWorker(window=self)
self.worker_thread.update_ui_signal.connect(self.update_submit_progress)
self.worker_thread.message_signal.connect(self.after_job_submission)
self.worker_thread.start()
@pyqtSlot(str, str)
def update_submit_progress(self, hostname, percent):
# Update the UI here. This slot will be executed in the main thread
self.submit_progress_label.setText(f"Transferring to {hostname} - {percent}%")
self.submit_progress.setMaximum(100)
self.submit_progress.setValue(int(percent))
class SubmitWorker(QThread):
"""Worker class called to submit all the jobs to the server and update the UI accordingly"""
message_signal = pyqtSignal(str)
update_ui_signal = pyqtSignal(str, str)
def __init__(self, window):
super().__init__()
self.window = window
def run(self):
def create_callback(encoder):
encoder_len = encoder.len
def callback(monitor):
percent = f"{monitor.bytes_read / encoder_len * 100:.0f}"
self.update_ui_signal.emit(hostname, percent)
return callback
try:
hostname = self.window.server_input.currentText()
resolution = (self.window.resolution_x_input.text(), self.window.resolution_y_input.text())
job_json = {'owner': psutil.Process().username() + '@' + socket.gethostname(),
'engine_name': self.window.engine_type.currentText().lower(),
'engine_version': self.window.engine_version_combo.currentText(),
'args': {'raw': self.window.raw_args.text(),
'export_format': self.window.file_format_combo.currentText(),
'resolution': resolution,
'fps': self.window.fps_input.text(),},
'output_path': self.window.job_name_input.text(),
'start_frame': self.window.start_frame_input.value(),
'end_frame': self.window.end_frame_input.value(),
'priority': self.window.priority_input.currentIndex() + 1,
'notes': self.window.notes_input.toPlainText(),
'enable_split_jobs': self.window.enable_splitjobs.isChecked(),
'split_jobs_same_os': self.window.splitjobs_same_os.isChecked(),
'name': self.window.job_name_input.text()}
# get the dynamic args
for i in range(self.window.engine_options_layout.count()):
layout = self.window.engine_options_layout.itemAt(i).layout()  # each option row is an HBoxLayout
for j in range(layout.count()):
widget = layout.itemAt(j).widget()
if isinstance(widget, QComboBox):
job_json['args'][self.window.current_engine_options[i]['name']] = widget.currentText()
elif isinstance(widget, QLineEdit):
job_json['args'][self.window.current_engine_options[i]['name']] = widget.text()
# determine if any cameras are checked
selected_cameras = []
# isHidden() is True for any non-current tab page, so check the tab's enabled state instead
if self.window.cameras_list.count() and self.window.cameras_group.isEnabled():
for index in range(self.window.cameras_list.count()):
item = self.window.cameras_list.item(index)
if item.checkState() == Qt.CheckState.Checked:
selected_cameras.append(item.text().rsplit('-', 1)[0].strip()) # cleanup to just camera name
# process cameras into nested format
input_path = self.window.scene_file_input.text()
if selected_cameras and self.window.cameras_list.count() > 1:
children_jobs = []
for cam in selected_cameras:
child_name = job_json['name'].replace(' ', '-') + "_" + cam.replace(' ', '')
child_job_data = {'args': {'camera': cam},
'name': child_name,
'output_path': child_name}
children_jobs.append(child_job_data)
job_json['child_jobs'] = children_jobs
# presubmission tasks - use local installs
engine_class = EngineManager.engine_class_with_name(self.window.engine_type.currentText().lower())
latest_engine = EngineManager.get_latest_engine_instance(engine_class)
input_path = Path(latest_engine.perform_presubmission_tasks(input_path))
# submit
err_msg = ""
result = self.window.server_proxy.post_job_to_server(file_path=input_path, job_data=job_json,
callback=create_callback)
if result is None:
err_msg = "Error posting job to server: no response from server"
elif not result.ok:
err_msg = f"Error posting job to server: {result.text}"
self.message_signal.emit(err_msg)
except Exception as e:
self.message_signal.emit(str(e))
class GetProjectInfoWorker(QThread):
"""Worker class called to retrieve information about a project file on a background thread and update the UI"""
message_signal = pyqtSignal()
error_signal = pyqtSignal(str)
def __init__(self, window, project_path):
super().__init__()
self.window = window
self.project_path = project_path
def run(self):
try:
# get the engine info and add them all to the ui
self.window.installed_engines = self.window.server_proxy.get_installed_engines()
# select the best engine for the file type
self.window.preferred_engine = self.window.server_proxy.get_engine_for_filename(self.project_path)
# this should be the only time we use a local engine instead of using the proxy besides submitting
engine_class = EngineManager.engine_class_for_project_path(self.project_path)
engine = EngineManager.get_latest_engine_instance(engine_class)
self.window.project_info = engine.get_project_info(self.project_path)
self.message_signal.emit()
except Exception as e:
self.error_signal.emit(str(e))
def clear_layout(layout):
if layout is not None:
# Go through the layout's items in reverse order
for i in reversed(range(layout.count())):
# Take the item at the current position
item = layout.takeAt(i)
# Check if the item is a widget
if item.widget():
# Remove the widget and delete it
widget_to_remove = item.widget()
widget_to_remove.setParent(None)
widget_to_remove.deleteLater()
elif item.layout():
# If the item is a sub-layout, clear its contents recursively
clear_layout(item.layout())
# Then delete the layout
item.layout().deleteLater()
# Run the application
if __name__ == '__main__':
app = QApplication([])
window = NewRenderJobForm()
window.show()  # the form is never visible without an explicit show()
app.exec()

View File

@@ -1,3 +1,4 @@
import sys
import logging
from PyQt6.QtGui import QFont
@@ -15,10 +16,7 @@ class QSignalHandler(logging.Handler, QObject):
def emit(self, record):
msg = self.format(record)
try:
self.new_record.emit(msg) # Emit signal
except RuntimeError:
pass
self.new_record.emit(msg) # Emit signal
class ConsoleWindow(QMainWindow):

View File

@@ -4,7 +4,6 @@ import subprocess
import sys
import threading
from PyQt6.QtCore import QTimer
from PyQt6.QtWidgets import (
QMainWindow, QWidget, QVBoxLayout, QPushButton, QTableWidget, QTableWidgetItem, QHBoxLayout, QAbstractItemView,
QHeaderView, QProgressBar, QLabel, QMessageBox
@@ -12,7 +11,7 @@ from PyQt6.QtWidgets import (
from src.api.server_proxy import RenderServerProxy
from src.engines.engine_manager import EngineManager
from src.utilities.misc_helper import is_localhost, launch_url
from src.utilities.misc_helper import is_localhost
class EngineBrowserWindow(QMainWindow):
@@ -29,7 +28,6 @@ class EngineBrowserWindow(QMainWindow):
self.setGeometry(100, 100, 500, 300)
self.engine_data = []
self.initUI()
self.init_timer()
def initUI(self):
# Central widget
@@ -84,21 +82,15 @@ class EngineBrowserWindow(QMainWindow):
self.update_download_status()
def init_timer(self):
# Set up the timer
self.timer = QTimer(self)
self.timer.timeout.connect(self.update_download_status)
self.timer.start(1000)
def update_table(self):
def update_table_worker():
raw_server_data = RenderServerProxy(self.hostname).get_all_engine_info()
raw_server_data = RenderServerProxy(self.hostname).get_renderer_info()
if not raw_server_data:
return
table_data = [] # convert the data into a flat list
for _, engine_data in raw_server_data.items():
for engine_name, engine_data in raw_server_data.items():
table_data.extend(engine_data['versions'])
self.engine_data = table_data
@@ -128,23 +120,25 @@ class EngineBrowserWindow(QMainWindow):
self.launch_button.setEnabled(is_localhost(self.hostname))
def update_download_status(self):
running_tasks = EngineManager.active_downloads()
running_tasks = [x for x in EngineManager.download_tasks if x.is_alive()]
hide_progress = not bool(running_tasks)
self.progress_bar.setHidden(hide_progress)
self.progress_label.setHidden(hide_progress)
# Update the status labels
if len(running_tasks) == 0:
new_status = ""
elif len(running_tasks) == 1:
task = running_tasks[0]
new_status = f"Downloading {task.engine.capitalize()} {task.version}..."
else:
new_status = f"Downloading {len(running_tasks)} engines..."
self.progress_label.setText(new_status)
# todo: update progress bar with status
self.progress_label.setText(f"Downloading {len(running_tasks)} engines")
def launch_button_click(self):
engine_info = self.engine_data[self.table_widget.currentRow()]
launch_url(engine_info['path'])
path = engine_info['path']
if sys.platform.startswith('darwin'):
subprocess.run(['open', path])
elif sys.platform.startswith('win32'):
os.startfile(path)
elif sys.platform.startswith('linux'):
subprocess.run(['xdg-open', path])
else:
raise OSError("Unsupported operating system")
def install_button_click(self):
self.update_download_status()

View File

@@ -1,39 +1,34 @@
''' app/ui/main_window.py '''
import ast
import datetime
import io
import logging
import os
import socket
import subprocess
import sys
import threading
import time
from typing import List, Dict, Any, Optional
import PIL
import humanize
from PIL import Image
from PyQt6.QtCore import Qt, QByteArray, QBuffer, QIODevice, QThread, pyqtSignal
from PyQt6.QtCore import Qt, QByteArray, QBuffer, QIODevice, QThread
from PyQt6.QtGui import QPixmap, QImage, QFont, QIcon
from PyQt6.QtWidgets import QMainWindow, QWidget, QHBoxLayout, QListWidget, QTableWidget, QAbstractItemView, \
QTableWidgetItem, QLabel, QVBoxLayout, QHeaderView, QMessageBox, QGroupBox, QPushButton, QListWidgetItem, \
QFileDialog
from src.api.api_server import API_VERSION
from src.api.serverproxy_manager import ServerProxyManager
from src.api.server_proxy import RenderServerProxy
from src.render_queue import RenderQueue
from src.ui.add_job_window import NewRenderJobForm
from src.ui.console_window import ConsoleWindow
from src.ui.engine_browser import EngineBrowserWindow
from src.ui.log_window import LogViewer
from src.ui.widgets.menubar import MenuBar
from src.ui.widgets.proportional_image_label import ProportionalImageLabel
from src.ui.widgets.statusbar import StatusBar
from src.ui.widgets.toolbar import ToolBar
from src.utilities.misc_helper import get_time_elapsed, resources_dir, is_localhost
from src.utilities.misc_helper import launch_url
from src.utilities.status_utils import RenderStatus
from src.utilities.zeroconf_server import ZeroconfServer
from src.version import APP_NAME
from .add_job import NewRenderJobForm
from .console import ConsoleWindow
from .engine_browser import EngineBrowserWindow
from .log_viewer import LogViewer
from .widgets.menubar import MenuBar
from .widgets.proportional_image_label import ProportionalImageLabel
from .widgets.statusbar import StatusBar
from .widgets.toolbar import ToolBar
from src.api.serverproxy_manager import ServerProxyManager
logger = logging.getLogger()
@@ -53,12 +48,6 @@ class MainWindow(QMainWindow):
super().__init__()
# Load the queue
self.job_list_view: Optional[QTableWidget] = None
self.server_info_ram: Optional[str] = None
self.server_info_cpu: Optional[str] = None
self.server_info_os: Optional[str] = None
self.server_info_gpu: Optional[List[Dict[str, Any]]] = None
self.server_info_hostname: Optional[str] = None
self.engine_browser_window = None
self.server_info_group = None
self.current_hostname = None
@@ -68,7 +57,7 @@ class MainWindow(QMainWindow):
self.buffer_handler = None
# Window-Settings
self.setWindowTitle(APP_NAME)
self.setWindowTitle("Zordon")
self.setGeometry(100, 100, 900, 800)
central_widget = QWidget(self)
self.setCentralWidget(central_widget)
@@ -78,7 +67,7 @@ class MainWindow(QMainWindow):
# Create a QLabel widget to display the image
self.image_label = ProportionalImageLabel()
self.image_label.setMaximumSize(700, 500)
self.image_label.setFixedHeight(300)
self.image_label.setFixedHeight(500)
self.image_label.setAlignment(Qt.AlignmentFlag.AlignTop | Qt.AlignmentFlag.AlignHCenter)
self.load_image_path(os.path.join(resources_dir(), 'Rectangle.png'))
@@ -92,18 +81,15 @@ class MainWindow(QMainWindow):
self.setup_ui(main_layout)
self.create_toolbars()
# Add Widgets to Window
# self.custom_menu_bar =
self.setMenuBar(MenuBar(self))
self.setStatusBar(StatusBar(self))
self.create_toolbars()
# start background update
self.found_servers = []
self.job_data = {}
self.bg_update_thread = BackgroundUpdater(window=self)
self.bg_update_thread.updated_signal.connect(self.update_ui_data)
self.bg_update_thread = QThread()
self.bg_update_thread.run = self.__background_update
self.bg_update_thread.start()
# Setup other windows
@@ -114,12 +100,7 @@ class MainWindow(QMainWindow):
# Pick default job
self.job_picked()
def setup_ui(self, main_layout: QVBoxLayout) -> None:
"""Setup the main user interface layout.
Args:
main_layout: The main layout container for the UI widgets.
"""
def setup_ui(self, main_layout):
# Servers
server_list_group = QGroupBox("Available Servers")
@@ -134,7 +115,6 @@ class MainWindow(QMainWindow):
self.server_info_os = QLabel()
self.server_info_cpu = QLabel()
self.server_info_ram = QLabel()
self.server_info_gpu = QLabel()
server_info_engines_button = QPushButton("Render Engines")
server_info_engines_button.clicked.connect(self.engine_browser)
server_info_layout = QVBoxLayout()
@@ -142,7 +122,6 @@ class MainWindow(QMainWindow):
server_info_layout.addWidget(self.server_info_os)
server_info_layout.addWidget(self.server_info_cpu)
server_info_layout.addWidget(self.server_info_ram)
server_info_layout.addWidget(self.server_info_gpu)
server_info_layout.addWidget(server_info_engines_button)
server_info_group.setLayout(server_info_layout)
@@ -169,41 +148,40 @@ class MainWindow(QMainWindow):
self.job_list_view.verticalHeader().setVisible(False)
self.job_list_view.itemSelectionChanged.connect(self.job_picked)
self.job_list_view.setEditTriggers(QAbstractItemView.EditTrigger.NoEditTriggers)
self.refresh_job_headers()
# Setup Job Headers
self.job_list_view.setHorizontalHeaderLabels(["ID", "Name", "Engine", "Priority", "Status",
"Time Elapsed", "Frames", "Date Created"])
self.job_list_view.setColumnHidden(0, True)
# Image Layout
image_group = QGroupBox("Job Preview")
image_layout = QVBoxLayout(image_group)
image_layout.setContentsMargins(0, 0, 0, 0)
image_center_layout = QHBoxLayout()
image_center_layout.addWidget(self.image_label)
image_layout.addWidget(self.image_label)
# image_layout.addLayout(image_center_layout)
self.job_list_view.horizontalHeader().setSectionResizeMode(1, QHeaderView.ResizeMode.Stretch)
self.job_list_view.horizontalHeader().setSectionResizeMode(2, QHeaderView.ResizeMode.ResizeToContents)
self.job_list_view.horizontalHeader().setSectionResizeMode(3, QHeaderView.ResizeMode.ResizeToContents)
self.job_list_view.horizontalHeader().setSectionResizeMode(4, QHeaderView.ResizeMode.ResizeToContents)
self.job_list_view.horizontalHeader().setSectionResizeMode(5, QHeaderView.ResizeMode.ResizeToContents)
self.job_list_view.horizontalHeader().setSectionResizeMode(6, QHeaderView.ResizeMode.ResizeToContents)
self.job_list_view.horizontalHeader().setSectionResizeMode(7, QHeaderView.ResizeMode.ResizeToContents)
# Job List Layout
job_list_group = QGroupBox("Job Preview")
# Job Layout
job_list_group = QGroupBox("Render Jobs")
job_list_layout = QVBoxLayout(job_list_group)
job_list_layout.setContentsMargins(0, 0, 0, 0)
job_list_layout.addWidget(self.image_label)
job_list_layout.addWidget(self.job_list_view, stretch=True)
image_layout.addWidget(self.job_list_view, stretch=True)
image_layout.addLayout(job_list_layout)
# Add them all to the window
main_layout.addLayout(info_layout)
right_layout = QVBoxLayout()
right_layout.setContentsMargins(0, 0, 0, 0)
right_layout.addWidget(job_list_group)
right_layout.addWidget(image_group)
# right_layout.addWidget(job_list_group)
main_layout.addLayout(right_layout)
def closeEvent(self, event):
"""Handle window close event with job running confirmation.
def __background_update(self):
while True:
self.update_servers()
self.fetch_jobs()
time.sleep(0.5)
Args:
event: The close event triggered by user.
"""
def closeEvent(self, event):
running_jobs = len(RenderQueue.running_jobs())
if running_jobs:
reply = QMessageBox.question(self, "Running Jobs",
@@ -216,12 +194,7 @@ class MainWindow(QMainWindow):
else:
event.ignore()
# -- Server Code -- #
def refresh_job_list(self):
"""Refresh the job list display."""
self.job_list_view.clearContents()
self.bg_update_thread.needs_update = True
# -- Server Code -- #
@property
def current_server_proxy(self):
@@ -238,7 +211,7 @@ class MainWindow(QMainWindow):
# Update the current hostname and clear the job list
self.current_hostname = new_hostname
self.job_list_view.setRowCount(0)
self.refresh_job_list()
self.fetch_jobs(clear_table=True)
# Select the first row if there are jobs listed
if self.job_list_view.rowCount():
@@ -253,56 +226,31 @@ class MainWindow(QMainWindow):
def update_server_info_display(self, hostname):
"""Updates the server information section of the UI."""
self.server_info_hostname.setText(f"Name: {hostname}")
self.server_info_hostname.setText(hostname or "unknown")
server_info = ZeroconfServer.get_hostname_properties(hostname)
# Use the get method with defaults to avoid KeyError
os_info = f"OS: {server_info.get('system_os', 'Unknown')} {server_info.get('system_os_version', '')}"
cleaned_cpu_name = server_info.get('system_cpu_brand', 'Unknown').replace(' CPU','').replace('(TM)','').replace('(R)', '')
cpu_info = f"CPU: {cleaned_cpu_name} ({server_info.get('system_cpu_cores', 'Unknown')} cores)"
memory_info = f"RAM: {server_info.get('system_memory', 'Unknown')} GB"
# Get and format GPU info
try:
gpu_list = ast.literal_eval(server_info.get('gpu_info', []))
# Format all GPUs
gpu_info_parts = []
for gpu in gpu_list:
gpu_name = gpu.get('name', 'Unknown').replace('(TM)','').replace('(R)', '')
gpu_memory = gpu.get('memory', 'Unknown')
# Add " GB" suffix if memory is a number
if isinstance(gpu_memory, (int, float)) or (isinstance(gpu_memory, str) and gpu_memory.isdigit()):
gpu_memory_str = f"{gpu_memory} GB"
else:
gpu_memory_str = str(gpu_memory)
gpu_info_parts.append(f"{gpu_name} ({gpu_memory_str})")
gpu_info = f"GPU: {', '.join(gpu_info_parts)}" if gpu_info_parts else "GPU: Unknown"
except Exception as e:
logger.error(f"Error parsing GPU info: {e}")
gpu_info = "GPU: Unknown"
cpu_info = f"CPU: {server_info.get('system_cpu', 'Unknown')} - {server_info.get('system_cpu_cores', 'Unknown')} cores"
self.server_info_os.setText(os_info.strip())
self.server_info_cpu.setText(cpu_info)
self.server_info_ram.setText(memory_info)
self.server_info_gpu.setText(gpu_info)
def update_ui_data(self):
"""Update UI data with current server and job information."""
self.update_servers()
def fetch_jobs(self, clear_table=False):
if not self.current_server_proxy:
return
server_job_data = self.job_data.get(self.current_server_proxy.hostname)
if server_job_data:
num_jobs = len(server_job_data)
if clear_table:
self.job_list_view.clear()
self.refresh_job_headers()
job_fetch = self.current_server_proxy.get_all_jobs(ignore_token=clear_table)
if job_fetch:
num_jobs = len(job_fetch)
self.job_list_view.setRowCount(num_jobs)
for row, job in enumerate(server_job_data):
for row, job in enumerate(job_fetch):
display_status = job['status'] if job['status'] != RenderStatus.RUNNING.value else \
('%.0f%%' % (job['percent_complete'] * 100)) # if running, show percent, otherwise just show status
@@ -314,43 +262,31 @@ class MainWindow(QMainWindow):
get_time_elapsed(start_time, end_time)
name = job.get('name') or os.path.basename(job.get('input_path', ''))
engine_name = f"{job.get('engine', '')}-{job.get('engine_version')}"
renderer = f"{job.get('renderer', '')}-{job.get('renderer_version')}"
priority = str(job.get('priority', ''))
total_frames = str(job.get('total_frames', ''))
converted_time = datetime.datetime.fromisoformat(job['date_created'])
humanized_time = humanize.naturaltime(converted_time)
items = [QTableWidgetItem(job['id']), QTableWidgetItem(name), QTableWidgetItem(engine_name),
items = [QTableWidgetItem(job['id']), QTableWidgetItem(name), QTableWidgetItem(renderer),
QTableWidgetItem(priority), QTableWidgetItem(display_status), QTableWidgetItem(time_elapsed),
QTableWidgetItem(total_frames), QTableWidgetItem(humanized_time)]
QTableWidgetItem(total_frames), QTableWidgetItem(job['date_created'])]
for col, item in enumerate(items):
self.job_list_view.setItem(row, col, item)
# -- Job Code -- #
# -- Job Code -- #
def job_picked(self):
def fetch_preview(job_id):
try:
default_image_path = "error.png"
before_fetch_hostname = self.current_server_proxy.hostname
response = self.current_server_proxy.request(f'job/{job_id}/thumbnail?size=big')
if response.ok:
try:
with io.BytesIO(response.content) as image_data_stream:
image = Image.open(image_data_stream)
if self.current_server_proxy.hostname == before_fetch_hostname and job_id == \
self.selected_job_ids()[0]:
self.load_image_data(image)
return
except PIL.UnidentifiedImageError:
default_image_path = response.text
else:
default_image_path = default_image_path or response.text
self.load_image_path(os.path.join(resources_dir(), default_image_path))
import io
image_data = response.content
image = Image.open(io.BytesIO(image_data))
if self.current_server_proxy.hostname == before_fetch_hostname and job_id == \
self.selected_job_ids()[0]:
self.load_image_data(image)
except ConnectionError as e:
logger.error(f"Connection error fetching image: {e}")
except Exception as e:
@@ -368,7 +304,7 @@ class MainWindow(QMainWindow):
current_status = self.job_list_view.item(selected_row.row(), 4).text()
# show / hide the stop button
show_stop_button = "%" in current_status
show_stop_button = current_status.lower() == 'running'
self.topbar.actions_call['Stop Job'].setEnabled(show_stop_button)
self.topbar.actions_call['Stop Job'].setVisible(show_stop_button)
self.topbar.actions_call['Delete Job'].setEnabled(not show_stop_button)
@@ -393,85 +329,79 @@ class MainWindow(QMainWindow):
self.topbar.actions_call['Open Files'].setVisible(False)
def selected_job_ids(self):
"""Get list of selected job IDs from the job list.
selected_rows = self.job_list_view.selectionModel().selectedRows()
job_ids = []
for selected_row in selected_rows:
id_item = self.job_list_view.item(selected_row.row(), 0)
job_ids.append(id_item.text())
return job_ids
Returns:
List[str]: List of selected job ID strings.
"""
try:
selected_rows = self.job_list_view.selectionModel().selectedRows()
job_ids = []
for selected_row in selected_rows:
id_item = self.job_list_view.item(selected_row.row(), 0)
job_ids.append(id_item.text())
return job_ids
except AttributeError:
return []
def refresh_job_headers(self):
self.job_list_view.setHorizontalHeaderLabels(["ID", "Name", "Renderer", "Priority", "Status",
"Time Elapsed", "Frames", "Date Created"])
self.job_list_view.setColumnHidden(0, True)
self.job_list_view.horizontalHeader().setSectionResizeMode(1, QHeaderView.ResizeMode.Stretch)
self.job_list_view.horizontalHeader().setSectionResizeMode(2, QHeaderView.ResizeMode.ResizeToContents)
self.job_list_view.horizontalHeader().setSectionResizeMode(3, QHeaderView.ResizeMode.ResizeToContents)
self.job_list_view.horizontalHeader().setSectionResizeMode(4, QHeaderView.ResizeMode.ResizeToContents)
self.job_list_view.horizontalHeader().setSectionResizeMode(5, QHeaderView.ResizeMode.ResizeToContents)
self.job_list_view.horizontalHeader().setSectionResizeMode(6, QHeaderView.ResizeMode.ResizeToContents)
self.job_list_view.horizontalHeader().setSectionResizeMode(7, QHeaderView.ResizeMode.ResizeToContents)
# -- Image Code -- #
# -- Image Code -- #
def load_image_path(self, image_path):
"""Load and display an image from file path.
Args:
image_path: Path to the image file to load.
"""
# Load and set image using QPixmap
try:
pixmap = QPixmap(image_path)
if not pixmap:
logger.error("Error loading image")
return
self.image_label.setPixmap(pixmap)
except Exception as e:
logger.error(f"Error loading image path: {e}")
# Load and set the image using QPixmap
pixmap = QPixmap(image_path)
if not pixmap:
logger.error("Error loading image")
return
self.image_label.setPixmap(pixmap)
def load_image_data(self, pillow_image):
try:
# Convert the Pillow Image to a QByteArray (byte buffer)
byte_array = QByteArray()
buffer = QBuffer(byte_array)
buffer.open(QIODevice.OpenModeFlag.WriteOnly)
pillow_image.save(buffer, "PNG")
buffer.close()
# Convert the Pillow Image to a QByteArray (byte buffer)
byte_array = QByteArray()
buffer = QBuffer(byte_array)
buffer.open(QIODevice.OpenModeFlag.WriteOnly)
pillow_image.save(buffer, "PNG")
buffer.close()
# Create a QImage from the QByteArray
image = QImage.fromData(byte_array)
# Create a QImage from the QByteArray
image = QImage.fromData(byte_array)
# Create a QPixmap from the QImage
pixmap = QPixmap.fromImage(image)
# Create a QPixmap from the QImage
pixmap = QPixmap.fromImage(image)
if not pixmap:
logger.error("Error loading image")
return
self.image_label.setPixmap(pixmap)
except Exception as e:
logger.error(f"Error loading image data: {e}")
if not pixmap:
logger.error("Error loading image")
return
self.image_label.setPixmap(pixmap)
def update_servers(self):
found_servers = list(set(ZeroconfServer.found_hostnames() + self.added_hostnames))
# Always make sure local hostname is first
if self.found_servers and not is_localhost(self.found_servers[0]):
for hostname in self.found_servers:
if found_servers and not is_localhost(found_servers[0]):
for hostname in found_servers:
if is_localhost(hostname):
found_servers.remove(hostname)
found_servers.insert(0, hostname)
break
old_count = self.server_list_view.count()
# Update proxys
for hostname in found_servers:
ServerProxyManager.get_proxy_for_hostname(hostname) # setup background updates
# Add in all the missing servers
current_server_list = []
for i in range(self.server_list_view.count()):
current_server_list.append(self.server_list_view.item(i).text())
for hostname in found_servers:
if hostname not in current_server_list:
properties = ZeroconfServer.get_hostname_properties(hostname)
image_path = os.path.join(resources_dir(), 'icons', f"{properties.get('system_os', 'Monitor')}.png")
list_widget = QListWidgetItem(QIcon(image_path), hostname)
self.server_list_view.addItem(list_widget)
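The localhost-first reordering at the top of `update_servers` can be isolated into a small pure helper. A minimal sketch, assuming `is_localhost` is a plain hostname predicate; `move_localhost_first` is an illustrative name, not part of the codebase:

```python
def is_localhost(hostname):
    # simplified stand-in for the app's real is_localhost() helper
    return hostname in ("localhost", "127.0.0.1")


def move_localhost_first(hostnames):
    """Return a copy of hostnames with the first localhost entry moved to index 0."""
    result = list(hostnames)
    if result and not is_localhost(result[0]):
        for hostname in result:
            if is_localhost(hostname):
                # pull the localhost entry out and reinsert it at the front
                result.remove(hostname)
                result.insert(0, hostname)
                break
    return result
```

Working on a copy keeps the helper side-effect free, which is what makes the in-method version above safe against mutating the list while iterating (it breaks immediately after the move).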
@@ -479,7 +409,7 @@ class MainWindow(QMainWindow):
servers_to_remove = []
for i in range(self.server_list_view.count()):
name = self.server_list_view.item(i).text()
if name not in found_servers:
servers_to_remove.append(name)
# remove any servers that shouldn't be shown any longer
@@ -508,25 +438,26 @@ class MainWindow(QMainWindow):
# Top Toolbar Buttons
self.topbar.add_button(
"New Job", f"{resources_directory}/icons/AddProduct.png", self.new_job)
self.topbar.add_button(
"Engines", f"{resources_directory}/icons/SoftwareInstaller.png", self.engine_browser)
self.topbar.add_button(
"Console", f"{resources_directory}/icons/Console.png", self.open_console_window)
self.topbar.add_separator()
self.topbar.add_button(
"Stop Job", f"{resources_directory}/icons/StopSign.png", self.stop_job)
self.topbar.add_button(
"Delete Job", f"{resources_directory}/icons/Trash.png", self.delete_job)
self.topbar.add_button(
"Render Log", f"{resources_directory}/icons/Document.png", self.job_logs)
self.topbar.add_button(
"Download", f"{resources_directory}/icons/Download.png", self.download_files)
self.topbar.add_button(
"Open Files", f"{resources_directory}/icons/SearchFolder.png", self.open_files)
self.addToolBar(Qt.ToolBarArea.TopToolBarArea, self.topbar)
# -- Toolbar Buttons -- #
def open_console_window(self) -> None:
"""
@@ -541,9 +472,8 @@ class MainWindow(QMainWindow):
self.engine_browser_window.show()
def job_logs(self) -> None:
"""Open log viewer for selected job.
Opens a log viewer window showing the logs for the currently selected job.
"""
selected_job_ids = self.selected_job_ids()
if selected_job_ids:
@@ -552,10 +482,8 @@ class MainWindow(QMainWindow):
self.log_viewer_window.show()
def stop_job(self, event):
"""Stop selected render jobs with user confirmation.
Args:
event: The button click event.
"""
job_ids = self.selected_job_ids()
if not job_ids:
@@ -565,26 +493,24 @@ class MainWindow(QMainWindow):
job = next((job for job in self.current_server_proxy.get_all_jobs() if job.get('id') == job_ids[0]), None)
if job:
display_name = job.get('name', os.path.basename(job.get('input_path', '')))
message = f"Are you sure you want to stop job: {display_name}?"
else:
return # Job not found, handle this case as needed
else:
message = f"Are you sure you want to stop these {len(job_ids)} jobs?"
# Display the message box and check the response in one go
msg_box = QMessageBox(QMessageBox.Icon.Warning, "Stop Job", message,
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No, self)
if msg_box.exec() == QMessageBox.StandardButton.Yes:
for job_id in job_ids:
self.current_server_proxy.cancel_job(job_id, confirm=True)
self.fetch_jobs(clear_table=True)
def delete_job(self, event):
"""Delete selected render jobs with user confirmation.
Args:
event: The button click event.
"""
job_ids = self.selected_job_ids()
if not job_ids:
@@ -607,18 +533,10 @@ class MainWindow(QMainWindow):
if msg_box.exec() == QMessageBox.StandardButton.Yes:
for job_id in job_ids:
self.current_server_proxy.delete_job(job_id, confirm=True)
self.fetch_jobs(clear_table=True)
def download_files(self, event):
job_ids = self.selected_job_ids()
if not job_ids:
return
import webbrowser
download_url = (f"http://{self.current_server_proxy.hostname}:{self.current_server_proxy.port}"
f"/api/job/{job_ids[0]}/download_all")
webbrowser.open(download_url)
def open_files(self, event):
job_ids = self.selected_job_ids()
@@ -628,7 +546,15 @@ class MainWindow(QMainWindow):
for job_id in job_ids:
job_info = self.current_server_proxy.get_job_info(job_id)
path = os.path.dirname(job_info['output_path'])
if sys.platform.startswith('darwin'):
subprocess.run(['open', path])
elif sys.platform.startswith('win32'):
os.startfile(path)
elif sys.platform.startswith('linux'):
subprocess.run(['xdg-open', path])
else:
raise OSError("Unsupported operating system")
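The platform dispatch in `open_files` can be split into a testable command-selection step plus a thin launcher. A sketch under the assumption that the caller falls back to `os.startfile` on Windows; `file_manager_command` and `reveal_in_file_manager` are hypothetical helper names:

```python
import os
import subprocess
import sys


def file_manager_command(path, platform=sys.platform):
    """Pick the platform-appropriate command to reveal a folder.

    Returns None on Windows, where os.startfile() is used instead of a subprocess.
    """
    if platform.startswith("darwin"):
        return ["open", path]
    if platform.startswith("win32"):
        return None  # caller should use os.startfile(path)
    if platform.startswith("linux"):
        return ["xdg-open", path]
    raise OSError("Unsupported operating system")


def reveal_in_file_manager(path):
    cmd = file_manager_command(path)
    if cmd is None:
        os.startfile(path)  # Windows only
    else:
        subprocess.run(cmd)
```

Keeping the branch logic in a pure function means the OS dispatch can be exercised in tests without actually spawning a file manager.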
def new_job(self) -> None:
@@ -636,56 +562,3 @@ class MainWindow(QMainWindow):
if file_name:
self.new_job_window = NewRenderJobForm(file_name)
self.new_job_window.show()
class BackgroundUpdater(QThread):
"""Worker class to fetch job and server information and update the UI"""
updated_signal = pyqtSignal()
error_signal = pyqtSignal(str)
def __init__(self, window):
super().__init__()
self.window = window
self.needs_update = True
def run(self):
"""Main background thread execution loop.
Continuously fetches server and job data, updating the main UI
every second or when updates are needed.
"""
try:
last_run = 0
while True:
now = time.monotonic()
if now - last_run >= 1.0 or self.needs_update:
self.window.found_servers = list(set(ZeroconfServer.found_hostnames() + self.window.added_hostnames))
self.window.found_servers = [x for x in self.window.found_servers if
ZeroconfServer.get_hostname_properties(x)['api_version'] == API_VERSION]
if self.window.current_server_proxy:
self.window.job_data[self.window.current_server_proxy.hostname] = \
self.window.current_server_proxy.get_all_jobs(ignore_token=False)
self.needs_update = False
last_run = now
self.updated_signal.emit()
time.sleep(0.05)
except Exception as e:
print(f"ERROR: {e}")
self.error_signal.emit(str(e))
if __name__ == "__main__":
# lazy load GUI frameworks
from PyQt6.QtWidgets import QApplication
# load application
# QtCore.QCoreApplication.setAttribute(QtCore.Qt.ApplicationAttribute.AA_MacDontSwapCtrlAndMeta)
app: QApplication = QApplication(sys.argv)
# configure main window
main_window = MainWindow()
# main_window.buffer_handler = buffer_handler
app.setActiveWindow(main_window)
main_window.show()
sys.exit(app.exec())
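The `BackgroundUpdater.run` loop above throttles refreshes with `time.monotonic`: the thread ticks every 50 ms but only refreshes once per second, unless `needs_update` forces an immediate pass, and `last_run` must advance after each refresh for the gate to work. The pattern can be isolated with an injectable clock; `RateLimitedPoller` is an illustrative name, not part of the codebase:

```python
import time


class RateLimitedPoller:
    """Run a refresh callback at most once per interval, or immediately when flagged."""

    def __init__(self, refresh, interval=1.0, clock=time.monotonic):
        self.refresh = refresh
        self.interval = interval
        self.clock = clock          # injectable for deterministic tests
        self.needs_update = True    # force an immediate first refresh
        self._last_run = float("-inf")

    def tick(self):
        """Call from a tight loop (e.g. every 50 ms); returns True if refresh ran."""
        now = self.clock()
        if self.needs_update or now - self._last_run >= self.interval:
            self.refresh()
            self._last_run = now      # advance the gate after every refresh
            self.needs_update = False
            return True
        return False
```

A fake clock makes the throttle behavior easy to assert without sleeping in tests.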


@@ -1,552 +0,0 @@
import os
import socket
from datetime import datetime
from pathlib import Path
import humanize
from PyQt6 import QtCore
from PyQt6.QtCore import Qt, QSettings, pyqtSignal as Signal, QThread, pyqtSignal, QTimer
from PyQt6.QtGui import QIcon
from PyQt6.QtWidgets import QApplication, QMainWindow, QListWidget, QListWidgetItem, QStackedWidget, QVBoxLayout, \
QWidget, QLabel, QCheckBox, QLineEdit, \
QPushButton, QHBoxLayout, QGroupBox, QTableWidget, QAbstractItemView, QTableWidgetItem, QHeaderView, \
QMessageBox, QProgressBar
from src.api.server_proxy import RenderServerProxy
from src.engines.engine_manager import EngineManager
from src.utilities.config import Config
from src.utilities.misc_helper import launch_url
from src.version import APP_AUTHOR, APP_NAME
settings = QSettings(APP_AUTHOR, APP_NAME)
class GetEngineInfoWorker(QThread):
"""
The GetEngineInfoWorker class fetches engine information from a server in a background thread.
Attributes:
done: A signal emitted when the engine information is retrieved.
Methods:
run(self): Fetches engine information from the server.
"""
done = pyqtSignal(object) # emits the result when finished
def __init__(self, parent=None):
super().__init__(parent)
self.parent = parent
def run(self):
data = RenderServerProxy(socket.gethostname()).get_all_engine_info()
self.done.emit(data)
class SettingsWindow(QMainWindow):
"""
The SettingsWindow class provides a user interface for managing engine settings.
"""
def __init__(self):
super().__init__()
self.engine_download_progress_bar = None
self.engines_last_update_label = None
self.check_for_engine_updates_checkbox = None
self.delete_engine_button = None
self.launch_engine_button = None
self.show_password_button = None
self.network_password_line = None
self.enable_network_password_checkbox = None
self.check_for_new_engines_button = None
if not EngineManager.engines_path: # fix issue where sometimes path was not set
EngineManager.engines_path = Path(Config.upload_folder).expanduser() / "engines"
self.installed_engines_table = None
self.setWindowTitle("Settings")
# Create the main layout
main_layout = QVBoxLayout()
# Create the sidebar (QListWidget) for navigation
self.sidebar = QListWidget()
self.sidebar.setFixedWidth(150)
# Set the icon size
self.sidebar.setIconSize(QtCore.QSize(32, 32)) # Increase the icon size to 32x32 pixels
# Adjust the font size for the sidebar items
font = self.sidebar.font()
font.setPointSize(12) # Increase the font size
self.sidebar.setFont(font)
# Add items with icons to the sidebar
resources_dir = os.path.join(Path(__file__).resolve().parent.parent.parent, 'resources')
self.add_sidebar_item("General", os.path.join(resources_dir, "Gear.png"))
self.add_sidebar_item("Server", os.path.join(resources_dir, "Server.png"))
self.add_sidebar_item("Engines", os.path.join(resources_dir, "Blender.png"))
self.sidebar.setCurrentRow(0)
# Create the stacked widget to hold different settings pages
self.stacked_widget = QStackedWidget()
# Create pages for each section
general_page = self.create_general_page()
network_page = self.create_network_page()
engines_page = self.create_engines_page()
# Add pages to the stacked widget
self.stacked_widget.addWidget(general_page)
self.stacked_widget.addWidget(network_page)
self.stacked_widget.addWidget(engines_page)
# Connect the sidebar to the stacked widget
self.sidebar.currentRowChanged.connect(self.stacked_widget.setCurrentIndex)
# Create a horizontal layout to hold the sidebar and stacked widget
content_layout = QHBoxLayout()
content_layout.addWidget(self.sidebar)
content_layout.addWidget(self.stacked_widget)
# Add the content layout to the main layout
main_layout.addLayout(content_layout)
# Add the "OK" button at the bottom
ok_button = QPushButton("OK")
ok_button.clicked.connect(self.close)
ok_button.setFixedWidth(80)
ok_button.setDefault(True)
main_layout.addWidget(ok_button, alignment=Qt.AlignmentFlag.AlignRight)
# Create a central widget and set the layout
central_widget = QWidget()
central_widget.setLayout(main_layout)
self.setCentralWidget(central_widget)
self.setMinimumSize(700, 400)
# timers for background download UI updates
self.timer = QTimer(self)
self.timer.timeout.connect(self.update_engine_download_status)
def add_sidebar_item(self, name, icon_path):
"""Add an item with an icon to the sidebar."""
item = QListWidgetItem(QIcon(icon_path), name)
self.sidebar.addItem(item)
def create_general_page(self):
"""Create the General settings page."""
page = QWidget()
layout = QVBoxLayout()
# Startup Settings Group
startup_group = QGroupBox("Startup Settings")
startup_layout = QVBoxLayout()
# startup_layout.addWidget(QCheckBox("Start application on system startup"))
check_for_updates_checkbox = QCheckBox("Check for updates automatically")
check_for_updates_checkbox.setChecked(settings.value("auto_check_for_updates", True, type=bool))
check_for_updates_checkbox.stateChanged.connect(lambda state: settings.setValue("auto_check_for_updates", bool(state)))
startup_layout.addWidget(check_for_updates_checkbox)
startup_group.setLayout(startup_layout)
# Local Files Group
data_path = Path(Config.upload_folder).expanduser()
path_size = sum(f.stat().st_size for f in Path(data_path).rglob('*') if f.is_file())
database_group = QGroupBox("Local Files")
database_layout = QVBoxLayout()
database_layout.addWidget(QLabel(f"Local Directory: {data_path}"))
database_layout.addWidget(QLabel(f"Size: {humanize.naturalsize(path_size, binary=True)}"))
open_database_path_button = QPushButton("Open Directory")
open_database_path_button.clicked.connect(lambda: launch_url(data_path))
open_database_path_button.setFixedWidth(200)
database_layout.addWidget(open_database_path_button)
database_group.setLayout(database_layout)
# Render Settings Group
render_settings_group = QGroupBox("Render Engine Settings")
render_settings_layout = QVBoxLayout()
render_settings_layout.addWidget(QLabel("Restrict to render nodes with same:"))
require_same_engine_checkbox = QCheckBox("Renderer Version")
require_same_engine_checkbox.setChecked(settings.value("render_require_same_engine_version", False, type=bool))
require_same_engine_checkbox.stateChanged.connect(lambda state: settings.setValue("render_require_same_engine_version", bool(state)))
render_settings_layout.addWidget(require_same_engine_checkbox)
require_same_cpu_checkbox = QCheckBox("CPU Architecture")
require_same_cpu_checkbox.setChecked(settings.value("render_require_same_cpu_type", False, type=bool))
require_same_cpu_checkbox.stateChanged.connect(lambda state: settings.setValue("render_require_same_cpu_type", bool(state)))
render_settings_layout.addWidget(require_same_cpu_checkbox)
require_same_os_checkbox = QCheckBox("Operating System")
require_same_os_checkbox.setChecked(settings.value("render_require_same_os", False, type=bool))
require_same_os_checkbox.stateChanged.connect(lambda state: settings.setValue("render_require_same_os", bool(state)))
render_settings_layout.addWidget(require_same_os_checkbox)
render_settings_group.setLayout(render_settings_layout)
layout.addWidget(startup_group)
layout.addWidget(database_group)
layout.addWidget(render_settings_group)
layout.addStretch() # Add a stretch to push content to the top
page.setLayout(layout)
return page
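`create_general_page` computes the on-disk size of the data directory with a `pathlib` generator expression. The same pattern as a standalone helper (a sketch; `directory_size` is an illustrative name):

```python
from pathlib import Path


def directory_size(root):
    """Total size in bytes of all regular files under root, recursively."""
    root = Path(root).expanduser()
    # rglob('*') walks every entry; is_file() skips directories and symlink targets that aren't files
    return sum(f.stat().st_size for f in root.rglob("*") if f.is_file())
```

The result is what gets passed to `humanize.naturalsize(..., binary=True)` for display.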
def create_network_page(self):
"""Create the Network settings page."""
page = QWidget()
layout = QVBoxLayout()
# Sharing Settings Group
sharing_group = QGroupBox("Sharing Settings")
sharing_layout = QVBoxLayout()
enable_sharing_checkbox = QCheckBox("Enable other computers on the network to render to this machine")
enable_sharing_checkbox.setChecked(settings.value("enable_network_sharing", False, type=bool))
enable_sharing_checkbox.stateChanged.connect(self.toggle_render_sharing)
sharing_layout.addWidget(enable_sharing_checkbox)
password_enabled = (settings.value("enable_network_sharing", False, type=bool) and
settings.value("enable_network_password", False, type=bool))
password_layout = QHBoxLayout()
password_layout.setContentsMargins(0, 0, 0, 0)
self.enable_network_password_checkbox = QCheckBox("Enable network password:")
self.enable_network_password_checkbox.setChecked(settings.value("enable_network_password", False, type=bool))
self.enable_network_password_checkbox.stateChanged.connect(self.enable_network_password_changed)
self.enable_network_password_checkbox.setEnabled(settings.value("enable_network_sharing", False, type=bool))
sharing_layout.addWidget(self.enable_network_password_checkbox)
self.network_password_line = QLineEdit()
self.network_password_line.setPlaceholderText("Enter a password")
self.network_password_line.setEchoMode(QLineEdit.EchoMode.Password)
self.network_password_line.setEnabled(password_enabled)
password_layout.addWidget(self.network_password_line)
self.show_password_button = QPushButton("Show")
self.show_password_button.setEnabled(password_enabled)
self.show_password_button.clicked.connect(self.show_password_button_pressed)
password_layout.addWidget(self.show_password_button)
sharing_layout.addLayout(password_layout)
sharing_group.setLayout(sharing_layout)
layout.addWidget(sharing_group)
layout.addStretch() # Add a stretch to push content to the top
page.setLayout(layout)
return page
def toggle_render_sharing(self, enable_sharing):
settings.setValue("enable_network_sharing", enable_sharing)
self.enable_network_password_checkbox.setEnabled(enable_sharing)
enable_password = enable_sharing and settings.value("enable_network_password", False, type=bool)
self.network_password_line.setEnabled(enable_password)
self.show_password_button.setEnabled(enable_password)
def enable_network_password_changed(self, new_value):
settings.setValue("enable_network_password", new_value)
self.network_password_line.setEnabled(new_value)
self.show_password_button.setEnabled(new_value)
def show_password_button_pressed(self):
# toggle showing / hiding the password
show_pass = self.show_password_button.text() == "Show"
self.show_password_button.setText("Hide" if show_pass else "Show")
self.network_password_line.setEchoMode(QLineEdit.EchoMode.Normal if show_pass else QLineEdit.EchoMode.Password)
def create_engines_page(self):
"""Create the Engines settings page."""
page = QWidget()
layout = QVBoxLayout()
# Installed Engines Group
installed_group = QGroupBox("Installed Engines")
installed_layout = QVBoxLayout()
# Setup table
self.installed_engines_table = EngineTableWidget()
self.installed_engines_table.row_selected.connect(self.engine_table_selected)
installed_layout.addWidget(self.installed_engines_table)
# Ignore system installs
engine_ignore_system_installs_checkbox = QCheckBox("Ignore system installs")
engine_ignore_system_installs_checkbox.setChecked(settings.value("engines_ignore_system_installs", False, type=bool))
engine_ignore_system_installs_checkbox.stateChanged.connect(self.change_ignore_system_installs)
installed_layout.addWidget(engine_ignore_system_installs_checkbox)
# Engine Launch / Delete buttons
installed_buttons_layout = QHBoxLayout()
self.launch_engine_button = QPushButton("Launch")
self.launch_engine_button.setEnabled(False)
self.launch_engine_button.clicked.connect(self.launch_selected_engine)
self.delete_engine_button = QPushButton("Delete")
self.delete_engine_button.setEnabled(False)
self.delete_engine_button.clicked.connect(self.delete_selected_engine)
installed_buttons_layout.addWidget(self.launch_engine_button)
installed_buttons_layout.addWidget(self.delete_engine_button)
installed_layout.addLayout(installed_buttons_layout)
installed_group.setLayout(installed_layout)
# Engine Updates Group
engine_updates_group = QGroupBox("Auto-Install")
engine_updates_layout = QVBoxLayout()
engine_download_layout = QHBoxLayout()
engine_download_layout.addWidget(QLabel("Enable Downloads for:"))
at_least_one_downloadable = False
for engine in EngineManager.downloadable_engines():
engine_download_check = QCheckBox(engine.name())
is_checked = settings.value(f"engine_download-{engine.name()}", False, type=bool)
at_least_one_downloadable |= is_checked
engine_download_check.setChecked(is_checked)
# Capture the checkbox correctly using a default argument in lambda
engine_download_check.clicked.connect(
lambda state, checkbox=engine_download_check: self.engine_download_settings_changed(state, checkbox.text())
)
engine_download_layout.addWidget(engine_download_check)
engine_updates_layout.addLayout(engine_download_layout)
self.check_for_engine_updates_checkbox = QCheckBox("Check for new versions on launch")
self.check_for_engine_updates_checkbox.setChecked(settings.value('check_for_engine_updates_on_launch', True, type=bool))
self.check_for_engine_updates_checkbox.setEnabled(at_least_one_downloadable)
self.check_for_engine_updates_checkbox.stateChanged.connect(
lambda state: settings.setValue("check_for_engine_updates_on_launch", bool(state)))
engine_updates_layout.addWidget(self.check_for_engine_updates_checkbox)
self.engines_last_update_label = QLabel()
self.update_last_checked_label()
self.engines_last_update_label.setEnabled(at_least_one_downloadable)
engine_updates_layout.addWidget(self.engines_last_update_label)
self.engine_download_progress_bar = QProgressBar()
engine_updates_layout.addWidget(self.engine_download_progress_bar)
self.engine_download_progress_bar.setHidden(True)
self.check_for_new_engines_button = QPushButton("Check for New Versions...")
self.check_for_new_engines_button.setEnabled(at_least_one_downloadable)
self.check_for_new_engines_button.clicked.connect(self.check_for_new_engines)
engine_updates_layout.addWidget(self.check_for_new_engines_button)
engine_updates_group.setLayout(engine_updates_layout)
layout.addWidget(installed_group)
layout.addWidget(engine_updates_group)
layout.addStretch() # Add a stretch to push content to the top
page.setLayout(layout)
return page
def change_ignore_system_installs(self, value):
settings.setValue("engines_ignore_system_installs", bool(value))
self.installed_engines_table.update_engines_table()
def update_last_checked_label(self):
"""Retrieve the last check timestamp and return a human-friendly string."""
last_checked_str = settings.value("engines_last_update_time", None)
if not last_checked_str:
time_string = "Never"
else:
last_checked_dt = datetime.fromisoformat(last_checked_str)
now = datetime.now()
time_string = humanize.naturaltime(now - last_checked_dt)
self.engines_last_update_label.setText(f"Last Updated: {time_string}")
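`update_last_checked_label` leans on `humanize.naturaltime` for the friendly delta. The same label logic can be sketched without that dependency, using coarse thresholds; `last_checked_text` and its cutoffs are illustrative, not the app's exact wording:

```python
from datetime import datetime


def last_checked_text(last_checked_iso, now=None):
    """Render an 'engines last updated' label value from a stored ISO timestamp."""
    if not last_checked_iso:
        return "Last Updated: Never"
    now = now or datetime.now()
    seconds = int((now - datetime.fromisoformat(last_checked_iso)).total_seconds())
    if seconds < 60:
        text = "seconds ago"
    elif seconds < 3600:
        text = f"{seconds // 60} minutes ago"
    else:
        text = f"{seconds // 3600} hours ago"
    return f"Last Updated: {text}"
```

Passing `now` explicitly keeps the function deterministic, which is also why the app stores the timestamp with `datetime.now().isoformat()` and parses it back with `fromisoformat`.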
def engine_download_settings_changed(self, state, engine_name):
settings.setValue(f"engine_download-{engine_name}", state)
at_least_one_downloadable = False
for engine in EngineManager.downloadable_engines():
at_least_one_downloadable |= settings.value(f"engine_download-{engine.name()}", False, type=bool)
self.check_for_new_engines_button.setEnabled(at_least_one_downloadable)
self.check_for_engine_updates_checkbox.setEnabled(at_least_one_downloadable)
self.engines_last_update_label.setEnabled(at_least_one_downloadable)
def delete_selected_engine(self):
engine_info = self.installed_engines_table.selected_engine_data()
reply = QMessageBox.question(self, f"Delete {engine_info['engine']} {engine_info['version']}?",
f"Do you want to delete {engine_info['engine']} {engine_info['version']}?",
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No)
if reply is not QMessageBox.StandardButton.Yes:
return
delete_result = EngineManager.delete_engine_download(engine_info.get('engine'),
engine_info.get('version'),
engine_info.get('system_os'),
engine_info.get('cpu'))
self.installed_engines_table.update_engines_table(use_cached=False)
if delete_result:
QMessageBox.information(self, f"{engine_info['engine']} {engine_info['version']} Deleted",
f"{engine_info['engine']} {engine_info['version']} deleted successfully",
QMessageBox.StandardButton.Ok)
else:
QMessageBox.warning(self, "Unknown Error",
f"Unknown error while deleting {engine_info['engine']} {engine_info['version']}.",
QMessageBox.StandardButton.Ok)
def launch_selected_engine(self):
engine_info = self.installed_engines_table.selected_engine_data()
if engine_info:
launch_url(engine_info['path'])
def engine_table_selected(self):
engine_data = self.installed_engines_table.selected_engine_data()
if engine_data:
self.launch_engine_button.setEnabled(bool(engine_data.get('path')))
self.delete_engine_button.setEnabled(engine_data.get('type') == 'managed')
else:
self.launch_engine_button.setEnabled(False)
self.delete_engine_button.setEnabled(False)
def check_for_new_engines(self):
ignore_system = settings.value("engines_ignore_system_installs", False, type=bool)
messagebox_shown = False
for engine in EngineManager.downloadable_engines():
if settings.value(f'engine_download-{engine.name()}', False, type=bool):
result = EngineManager.is_engine_update_available(engine, ignore_system_installs=ignore_system)
if result:
result['name'] = engine.name()
msg_box = QMessageBox()
msg_box.setWindowTitle(f"{result['name']} ({result['version']}) Available")
msg_box.setText(f"A new version of {result['name']} is available ({result['version']}).\n\n"
f"Would you like to download it now?")
msg_box.setIcon(QMessageBox.Icon.Question)
msg_box.setStandardButtons(QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No)
msg_result = msg_box.exec()
messagebox_shown = True
if msg_result == QMessageBox.StandardButton.Yes:
EngineManager.download_engine(engine_name=engine.name(), version=result['version'], background=True,
ignore_system=ignore_system)
self.engine_download_progress_bar.setHidden(False)
self.engine_download_progress_bar.setValue(0)
self.engine_download_progress_bar.setMaximum(100)
self.check_for_new_engines_button.setEnabled(False)
self.timer.start(1000)
if not messagebox_shown:
msg_box = QMessageBox()
msg_box.setWindowTitle("No Updates Available")
msg_box.setText("No Updates Available.")
msg_box.setIcon(QMessageBox.Icon.Information)
msg_box.setStandardButtons(QMessageBox.StandardButton.Ok)
msg_box.exec()
settings.setValue("engines_last_update_time", datetime.now().isoformat())
self.update_engine_download_status()
def update_engine_download_status(self):
running_tasks = EngineManager.active_downloads()
if not running_tasks:
self.timer.stop()
self.engine_download_progress_bar.setHidden(True)
self.installed_engines_table.update_engines_table(use_cached=False)
self.update_last_checked_label()
self.check_for_new_engines_button.setEnabled(True)
return
percent_complete = int(running_tasks[0].percent_complete * 100)
self.engine_download_progress_bar.setValue(percent_complete)
if percent_complete == 100:
status_update = f"Installing {running_tasks[0].engine.capitalize()} {running_tasks[0].version}..."
else:
status_update = f"Downloading {running_tasks[0].engine.capitalize()} {running_tasks[0].version}..."
self.engines_last_update_label.setText(status_update)
class EngineTableWidget(QWidget):
"""
The EngineTableWidget class displays a table of installed engines.
Attributes:
table: A table widget displaying engine information.
Methods:
on_selection_changed(self): Emits a signal when the user selects a different row in the table.
"""
row_selected = Signal()
def __init__(self):
super().__init__()
self.__get_engine_info_worker = None
self.table = QTableWidget(0, 4)
self.table.setHorizontalHeaderLabels(["Engine", "Version", "Type", "Path"])
self.table.setSelectionBehavior(QAbstractItemView.SelectionBehavior.SelectRows)
self.table.verticalHeader().setVisible(False)
# self.table_widget.itemSelectionChanged.connect(self.engine_picked)
self.table.setEditTriggers(QAbstractItemView.EditTrigger.NoEditTriggers)
self.table.selectionModel().selectionChanged.connect(self.on_selection_changed)
layout = QVBoxLayout(self)
layout.setContentsMargins(0, 0, 0, 0)
layout.setSpacing(0)
layout.addWidget(self.table)
self.raw_server_data = None
def showEvent(self, event):
"""Runs when the widget is about to be shown."""
self.update_engines_table()
super().showEvent(event) # Ensure normal event processing
def engine_data_ready(self, raw_server_data):
self.raw_server_data = raw_server_data
self.update_engines_table()
def update_engines_table(self, use_cached=True):
if not self.raw_server_data or not use_cached:
self.__get_engine_info_worker = GetEngineInfoWorker(self)
self.__get_engine_info_worker.done.connect(self.engine_data_ready)
self.__get_engine_info_worker.start()
if not self.raw_server_data:
return
table_data = [] # convert the data into a flat list
for _, engine_data in self.raw_server_data.items():
table_data.extend(engine_data['versions'])
if settings.value("engines_ignore_system_installs", False, type=bool):
table_data = [x for x in table_data if x['type'] != 'system']
self.table.clear()
self.table.setRowCount(len(table_data))
self.table.setColumnCount(4)
self.table.setHorizontalHeaderLabels(['Engine', 'Version', 'Type', 'Path'])
self.table.horizontalHeader().setSectionResizeMode(0, QHeaderView.ResizeMode.Fixed)
self.table.horizontalHeader().setSectionResizeMode(1, QHeaderView.ResizeMode.Fixed)
self.table.horizontalHeader().setSectionResizeMode(2, QHeaderView.ResizeMode.Fixed)
self.table.horizontalHeader().setSectionResizeMode(3, QHeaderView.ResizeMode.Stretch)
for row, engine in enumerate(table_data):
self.table.setItem(row, 0, QTableWidgetItem(engine['engine']))
self.table.setItem(row, 1, QTableWidgetItem(engine['version']))
self.table.setItem(row, 2, QTableWidgetItem(engine['type']))
self.table.setItem(row, 3, QTableWidgetItem(engine['path']))
self.table.selectRow(0)
def selected_engine_data(self):
"""Returns the data from the selected row as a dictionary."""
row = self.table.currentRow() # Get the selected row index
if row < 0 or not len(self.table.selectedItems()): # No row selected
return None
data = {
"engine": self.table.item(row, 0).text(),
"version": self.table.item(row, 1).text(),
"type": self.table.item(row, 2).text(),
"path": self.table.item(row, 3).text(),
}
return data
def on_selection_changed(self):
self.row_selected.emit()
if __name__ == "__main__":
app = QApplication([])
window = SettingsWindow()
window.show()
app.exec()

src/ui/widgets/dialog.py Normal file

@@ -0,0 +1 @@
''' app/ui/widgets/dialog.py '''


@@ -1,6 +1,5 @@
''' app/ui/widgets/menubar.py '''
from PyQt6.QtGui import QAction
from PyQt6.QtWidgets import QMenuBar, QApplication, QMessageBox, QDialog, QVBoxLayout, QLabel, QPushButton
class MenuBar(QMenuBar):
@@ -13,93 +12,12 @@ class MenuBar(QMenuBar):
def __init__(self, parent=None) -> None:
super().__init__(parent)
self.settings_window = None
# setup menus
file_menu = self.addMenu("File")
# edit_menu = self.addMenu("Edit")
# view_menu = self.addMenu("View")
help_menu = self.addMenu("Help")
# --file menu--
# new job
new_job_action = QAction("New Job...", self)
new_job_action.setShortcut('Ctrl+N')
new_job_action.triggered.connect(self.new_job)
file_menu.addAction(new_job_action)
# settings
settings_action = QAction("Settings...", self)
settings_action.triggered.connect(self.show_settings)
settings_action.setShortcut('Ctrl+,')
file_menu.addAction(settings_action)
# exit
exit_action = QAction('&Exit', self)
exit_action.setShortcut('Ctrl+Q')
exit_action.triggered.connect(QApplication.instance().quit)
file_menu.addAction(exit_action)
# --help menu--
about_action = QAction("About", self)
about_action.triggered.connect(self.show_about)
help_menu.addAction(about_action)
update_action = QAction("Check for Updates...", self)
update_action.triggered.connect(self.check_for_updates)
help_menu.addAction(update_action)
def new_job(self):
self.parent().new_job()
def show_settings(self):
from src.ui.settings_window import SettingsWindow
self.settings_window = SettingsWindow()
self.settings_window.show()
@staticmethod
def show_about():
from src.ui.about_window import AboutDialog
dialog = AboutDialog()
dialog.exec()
@staticmethod
def check_for_updates():
from src.utilities.misc_helper import check_for_updates
from src.version import APP_NAME, APP_VERSION, APP_REPO_NAME, APP_REPO_OWNER
found_update = check_for_updates(APP_REPO_NAME, APP_REPO_OWNER, APP_NAME, APP_VERSION)
if found_update:
dialog = UpdateDialog(found_update, APP_VERSION)
dialog.exec()
else:
QMessageBox.information(None, "No Update", "No updates available.")
class UpdateDialog(QDialog):
def __init__(self, release_info, current_version, parent=None):
super().__init__(parent)
self.setWindowTitle(f"Update Available ({current_version} -> {release_info['tag_name']})")
layout = QVBoxLayout()
label = QLabel(f"A new version ({release_info['tag_name']}) is available! Current version: {current_version}")
layout.addWidget(label)
# Label to show the release notes
description = QLabel(release_info["body"])
layout.addWidget(description)
# Button to download the latest version
download_button = QPushButton(f"Download Latest Version ({release_info['tag_name']})")
download_button.clicked.connect(lambda: self.open_url(release_info["html_url"]))
layout.addWidget(download_button)
# OK button to dismiss the dialog
ok_button = QPushButton("Dismiss")
ok_button.clicked.connect(self.accept) # Close the dialog when clicked
layout.addWidget(ok_button)
self.setLayout(layout)
def open_url(self, url):
from PyQt6.QtCore import QUrl
from PyQt6.QtGui import QDesktopServices
QDesktopServices.openUrl(QUrl(url))
self.accept()
# Add actions to the menus
# file_menu.addAction(self.parent().topbar.actions_call["Open"]) # type: ignore
# file_menu.addAction(self.parent().topbar.actions_call["Save"]) # type: ignore
# file_menu.addAction(self.parent().topbar.actions_call["Exit"]) # type: ignore


@@ -9,7 +9,6 @@ from PyQt6.QtGui import QPixmap
from PyQt6.QtWidgets import QStatusBar, QLabel
from src.api.server_proxy import RenderServerProxy
from src.engines.engine_manager import EngineManager
from src.utilities.misc_helper import resources_dir
@@ -29,27 +28,17 @@ class StatusBar(QStatusBar):
proxy = RenderServerProxy(socket.gethostname())
proxy.start_background_update()
image_names = {'Ready': 'GreenCircle.png', 'Offline': "RedSquare.png"}
last_update = None
# Check for status change every 1s on background thread
while True:
try:
# update status label - get download status
new_status = proxy.status()
active_downloads = EngineManager.active_downloads()
if active_downloads:
if len(active_downloads) == 1:
task = active_downloads[0]
new_status = f"{new_status} | Downloading {task.engine.capitalize()} {task.version}..."
else:
new_status = f"{new_status} | Downloading {len(active_downloads)} engines"
self.messageLabel.setText(new_status)
# update status image
new_status = proxy.status()
if new_status != last_update:
new_image_name = image_names.get(new_status, 'Synchronize.png')
new_image_path = os.path.join(resources_dir(), new_image_name)
self.label.setPixmap((QPixmap(new_image_path).scaled(16, 16, Qt.AspectRatioMode.KeepAspectRatio)))
except RuntimeError: # ignore runtime errors during shutdown
pass
image_path = os.path.join(resources_dir(), 'icons', new_image_name)
self.label.setPixmap((QPixmap(image_path).scaled(16, 16, Qt.AspectRatioMode.KeepAspectRatio)))
self.messageLabel.setText(new_status)
last_update = new_status
time.sleep(1)
background_thread = threading.Thread(target=background_update,)
@@ -58,7 +47,7 @@ class StatusBar(QStatusBar):
# Create a label that holds an image
self.label = QLabel()
image_path = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(__file__))), 'resources',
image_path = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(__file__))), 'resources', 'icons',
'RedSquare.png')
pixmap = (QPixmap(image_path).scaled(16, 16, Qt.AspectRatioMode.KeepAspectRatio))
self.label.setPixmap(pixmap)


@@ -1,78 +0,0 @@
import concurrent.futures
import os
import time
import logging
logger = logging.getLogger()
def cpu_workload(n):
# Simple arithmetic operation for workload
while n > 0:
n -= 1
return n
def cpu_benchmark(duration_seconds=10):
# Determine the number of available CPU cores
num_cores = os.cpu_count()
# Calculate workload per core, assuming a large number for the workload
workload_per_core = 10000000
# Record start time
start_time = time.time()
# Use ProcessPoolExecutor to utilize all CPU cores
with concurrent.futures.ProcessPoolExecutor() as executor:
# Launching tasks for each core
futures = [executor.submit(cpu_workload, workload_per_core) for _ in range(num_cores)]
# Wait for all futures to complete, with a timeout to limit the benchmark duration
concurrent.futures.wait(futures, timeout=duration_seconds)
# Record end time
end_time = time.time()
# Calculate the total number of operations (workload) done by all cores
total_operations = workload_per_core * num_cores
# Calculate the total time taken
total_time = end_time - start_time
# Calculate operations per second as the score
score = total_operations / total_time
score = score * 0.0001  # scale raw operations/sec down to a compact score
return int(score)
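One subtlety in the benchmark above: `concurrent.futures.wait(..., timeout=...)` only stops *waiting* when the timeout elapses — the submitted workloads keep running, so the measured `total_time` can exceed `duration_seconds`. A minimal sketch (using threads and toy sleeps in place of the CPU workload) showing how `wait` partitions futures into done and pending sets:

```python
import concurrent.futures
import time

def sleepy(seconds):
    # stand-in for a long-running workload
    time.sleep(seconds)
    return seconds

with concurrent.futures.ThreadPoolExecutor(max_workers=2) as executor:
    futures = [executor.submit(sleepy, s) for s in (0.05, 0.5)]
    # stop waiting after 0.2s; the 0.5s task is still running afterwards
    done, not_done = concurrent.futures.wait(futures, timeout=0.2)
    print(len(done), len(not_done))  # 1 1
```

Exiting the executor's `with` block still joins the pending task, which is why the benchmark's wall-clock time is a lower bound rather than an exact budget.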
def disk_io_benchmark(file_size_mb=100, filename='benchmark_test_file'):
# Measure write speed
start_time = time.time()
with open(filename, 'wb') as f:
f.write(os.urandom(file_size_mb * 1024 * 1024)) # Write random bytes to file
end_time = time.time()
write_time = end_time - start_time
write_speed = file_size_mb / write_time
# Measure read speed
start_time = time.time()
with open(filename, 'rb') as f:
content = f.read()
end_time = time.time()
read_time = end_time - start_time
read_speed = file_size_mb / read_time
# Cleanup
os.remove(filename)
logger.debug(f"Disk Write Speed: {write_speed:.2f} MB/s")
logger.debug(f"Disk Read Speed: {read_speed:.2f} MB/s")
return write_speed, read_speed
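Note that `os.urandom(file_size_mb * 1024 * 1024)` materialises the entire test file in memory before writing; for larger sizes a chunked write keeps memory flat. A hedged sketch (the `write_random_file` helper name is illustrative, not part of the codebase):

```python
import os

def write_random_file(filename, size_mb, chunk_mb=1):
    # write size_mb of random data in chunk_mb pieces to bound memory use
    chunk = chunk_mb * 1024 * 1024
    remaining = size_mb * 1024 * 1024
    with open(filename, 'wb') as f:
        while remaining > 0:
            f.write(os.urandom(min(chunk, remaining)))
            remaining -= chunk

write_random_file('benchmark_test_file', 4)
print(os.path.getsize('benchmark_test_file'))  # 4194304
os.remove('benchmark_test_file')
```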
if __name__ == '__main__':
print(cpu_benchmark())
print(disk_io_benchmark())


@@ -1,8 +1,5 @@
import os
from pathlib import Path
import yaml
from src.utilities.misc_helper import current_system_os, copy_directory_contents
class Config:
@@ -12,7 +9,7 @@ class Config:
max_content_path = 100000000
server_log_level = 'debug'
log_buffer_length = 250
worker_process_timeout = 120
subjob_connection_timeout = 120
flask_log_level = 'error'
flask_debug_enable = False
queue_eval_seconds = 1
@@ -25,51 +22,15 @@ class Config:
with open(config_path, 'r') as ymlfile:
cfg = yaml.safe_load(ymlfile)
cls.upload_folder = str(Path(cfg.get('upload_folder', cls.upload_folder)).expanduser())
cls.upload_folder = os.path.expanduser(cfg.get('upload_folder', cls.upload_folder))
cls.update_engines_on_launch = cfg.get('update_engines_on_launch', cls.update_engines_on_launch)
cls.max_content_path = cfg.get('max_content_path', cls.max_content_path)
cls.server_log_level = cfg.get('server_log_level', cls.server_log_level)
cls.log_buffer_length = cfg.get('log_buffer_length', cls.log_buffer_length)
cls.worker_process_timeout = cfg.get('worker_process_timeout', cls.worker_process_timeout)
cls.subjob_connection_timeout = cfg.get('subjob_connection_timeout', cls.subjob_connection_timeout)
cls.flask_log_level = cfg.get('flask_log_level', cls.flask_log_level)
cls.flask_debug_enable = cfg.get('flask_debug_enable', cls.flask_debug_enable)
cls.queue_eval_seconds = cfg.get('queue_eval_seconds', cls.queue_eval_seconds)
cls.port_number = cfg.get('port_number', cls.port_number)
cls.enable_split_jobs = cfg.get('enable_split_jobs', cls.enable_split_jobs)
cls.download_timeout_seconds = cfg.get('download_timeout_seconds', cls.download_timeout_seconds)
@classmethod
def config_dir(cls) -> Path:
# Set up the config path
if current_system_os() == 'macos':
local_config_path = Path('~/Library/Application Support/Zordon').expanduser()
elif current_system_os() == 'windows':
local_config_path = Path(os.environ['APPDATA']) / 'Zordon'
else:
local_config_path = Path('~/.config/Zordon').expanduser()
return local_config_path
@classmethod
def setup_config_dir(cls):
# Set up the config path
local_config_dir = cls.config_dir()
if os.path.exists(local_config_dir):
return
try:
# Create the local configuration directory
os.makedirs(local_config_dir)
# Determine the template path
resource_environment_path = os.environ.get('RESOURCEPATH')
if resource_environment_path:
template_path = Path(resource_environment_path) / 'config'
else:
template_path = Path(__file__).resolve().parents[2] / 'config'
# Copy contents from the template to the local configuration directory
copy_directory_contents(template_path, local_config_dir)
except Exception as e:
print(f"An error occurred while setting up the config directory: {e}")
raise


@@ -4,18 +4,17 @@ from src.engines.ffmpeg.ffmpeg_engine import FFMPEG
def image_sequence_to_video(source_glob_pattern, output_path, framerate=24, encoder="prores_ks", profile=4,
start_frame=1):
subprocess.run([FFMPEG.default_engine_path(), "-framerate", str(framerate), "-start_number",
str(start_frame), "-i", f"{source_glob_pattern}", "-c:v", encoder, "-profile:v", str(profile),
'-pix_fmt', 'yuva444p10le', output_path], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
check=True)
subprocess.run([FFMPEG.default_renderer_path(), "-framerate", str(framerate), "-start_number", str(start_frame), "-i",
f"{source_glob_pattern}", "-c:v", encoder, "-profile:v", str(profile), '-pix_fmt', 'yuva444p10le',
output_path], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, check=True)
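The argument list assembled above can be factored out and unit-tested without actually invoking ffmpeg. A sketch under that assumption (the `build_sequence_args` helper and the printf-style `%04d` pattern are illustrative, not part of the codebase):

```python
def build_sequence_args(ffmpeg_path, pattern, output_path, framerate=24,
                        start_frame=1, encoder="prores_ks", profile=4):
    # mirrors the argument order used by image_sequence_to_video
    return [ffmpeg_path, "-framerate", str(framerate),
            "-start_number", str(start_frame), "-i", pattern,
            "-c:v", encoder, "-profile:v", str(profile),
            "-pix_fmt", "yuva444p10le", output_path]

args = build_sequence_args("ffmpeg", "frame_%04d.png", "out.mov")
print(args[:3])  # ['ffmpeg', '-framerate', '24']
```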
def save_first_frame(source_path, dest_path, max_width=1280):
subprocess.run([FFMPEG.default_engine_path(), '-i', source_path, '-vf', f'scale={max_width}:-1',
subprocess.run([FFMPEG.default_renderer_path(), '-i', source_path, '-vf', f'scale={max_width}:-1',
'-vframes', '1', dest_path], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, check=True)
def generate_thumbnail(source_path, dest_path, max_width=240, fps=12):
subprocess.run([FFMPEG.default_engine_path(), '-i', source_path, '-vf',
subprocess.run([FFMPEG.default_renderer_path(), '-i', source_path, '-vf',
f"scale={max_width}:trunc(ow/a/2)*2,format=yuv420p", '-r', str(fps), '-c:v', 'libx264', '-preset',
'ultrafast', '-an', dest_path], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, check=True)


@@ -1,44 +1,25 @@
import logging
import json
import os
import platform
import re
import shutil
import socket
import string
import subprocess
import sys
from datetime import datetime
from typing import Optional, List, Dict, Any
logger = logging.getLogger()
def launch_url(url: str) -> None:
logger = logging.getLogger(__name__)
if shutil.which('xdg-open'):
opener = 'xdg-open'
elif shutil.which('open'):
opener = 'open'
elif shutil.which('cmd'):
opener = 'start'
def launch_url(url):
if subprocess.run(['which', 'xdg-open'], capture_output=True).returncode == 0:
subprocess.run(['xdg-open', url]) # linux
elif subprocess.run(['which', 'open'], capture_output=True).returncode == 0:
subprocess.run(['open', url]) # macos
elif subprocess.run(['which', 'start'], capture_output=True).returncode == 0:
subprocess.run(['start', url]) # windows - need to validate this works
else:
error_message = f"No valid launchers found to launch URL: {url}"
logger.error(error_message)
raise OSError(error_message)
try:
if opener == 'start':
# For Windows, use 'cmd /c start'
subprocess.run(['cmd', '/c', 'start', url], shell=False)
else:
subprocess.run([opener, url])
except Exception as e:
logger.error(f"Failed to launch URL: {url}. Error: {e}")
logger.error(f"No valid launchers found to launch url: {url}")
def file_exists_in_mounts(filepath: str) -> Optional[str]:
def file_exists_in_mounts(filepath):
"""
Check if a file exists in any mounted directory.
It searches for the file in common mount points like '/Volumes', '/mnt', and '/media'.
@@ -53,9 +34,9 @@ def file_exists_in_mounts(filepath: str) -> Optional[str]:
path = os.path.normpath(path)
components = []
while True:
path, comp = os.path.split(path)
if comp:
components.append(comp)
path, component = os.path.split(path)
if component:
components.append(component)
else:
if path:
components.append(path)
@@ -79,19 +60,22 @@ def file_exists_in_mounts(filepath: str) -> Optional[str]:
return possible_mount_path
def get_time_elapsed(start_time: Optional[datetime] = None, end_time: Optional[datetime] = None) -> str:
def get_time_elapsed(start_time=None, end_time=None):
from string import Template
class DeltaTemplate(Template):
delimiter = "%"
def strfdelta(tdelta, fmt='%H:%M:%S'):
days = tdelta.days
d = {"D": tdelta.days}
hours, rem = divmod(tdelta.seconds, 3600)
minutes, seconds = divmod(rem, 60)
# Using f-strings for formatting
formatted_str = fmt.replace('%D', f'{days}')
formatted_str = formatted_str.replace('%H', f'{hours:02d}')
formatted_str = formatted_str.replace('%M', f'{minutes:02d}')
formatted_str = formatted_str.replace('%S', f'{seconds:02d}')
return formatted_str
d["H"] = '{:02d}'.format(hours)
d["M"] = '{:02d}'.format(minutes)
d["S"] = '{:02d}'.format(seconds)
t = DeltaTemplate(fmt)
return t.substitute(**d)
# calculate elapsed time
elapsed_time = None
@@ -106,10 +90,10 @@ def get_time_elapsed(start_time: Optional[datetime] = None, end_time: Optional[d
return elapsed_time_string
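The nested `strfdelta` helper above formats a `timedelta` with `%`-style placeholders. Extracted on its own (using the `DeltaTemplate`/`string.Template` variant) it behaves like this — note that `%H` is hours within the day, so whole days only surface through `%D`:

```python
from datetime import timedelta
from string import Template

class DeltaTemplate(Template):
    delimiter = "%"

def strfdelta(tdelta, fmt='%H:%M:%S'):
    # %H counts hours within the day; whole days only appear via %D
    hours, rem = divmod(tdelta.seconds, 3600)
    minutes, seconds = divmod(rem, 60)
    d = {"D": tdelta.days, "H": f'{hours:02d}', "M": f'{minutes:02d}', "S": f'{seconds:02d}'}
    return DeltaTemplate(fmt).substitute(**d)

print(strfdelta(timedelta(hours=1, minutes=2, seconds=3)))  # 01:02:03
```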
def get_file_size_human(file_path: str) -> str:
def get_file_size_human(file_path):
size_in_bytes = os.path.getsize(file_path)
# Convert size to a human-readable format
# Convert size to a human readable format
if size_in_bytes < 1024:
return f"{size_in_bytes} B"
elif size_in_bytes < 1024 ** 2:
@@ -122,273 +106,43 @@ def get_file_size_human(file_path: str) -> str:
return f"{size_in_bytes / 1024 ** 4:.2f} TB"
def current_system_os() -> str:
# Convert path to the appropriate format for the current platform
def system_safe_path(path):
if platform.system().lower() == "windows":
return os.path.normpath(path)
return path.replace("\\", "/")
def current_system_os():
return platform.system().lower().replace('darwin', 'macos')
def current_system_os_version() -> str:
return platform.release()
def current_system_os_version():
return platform.mac_ver()[0] if current_system_os() == 'macos' else platform.release().lower()
def current_system_cpu() -> str:
return platform.machine().lower().replace('amd64', 'x64')
def current_system_cpu():
# convert all x86 64 to "x64"
return platform.machine().lower().replace('amd64', 'x64').replace('x86_64', 'x64')
def current_system_cpu_brand() -> str:
"""Fast cross-platform CPU brand string"""
if sys.platform.startswith('darwin'): # macOS
try:
return subprocess.check_output(['sysctl', '-n', 'machdep.cpu.brand_string']).decode().strip()
except Exception:
pass
elif sys.platform.startswith('win'): # Windows
from winreg import HKEY_LOCAL_MACHINE, OpenKey, QueryValueEx
try:
# Open the registry key where Windows stores the CPU name
key = OpenKey(HKEY_LOCAL_MACHINE, r"HARDWARE\DESCRIPTION\System\CentralProcessor\0")
# The value name is "ProcessorNameString"
value, _ = QueryValueEx(key, "ProcessorNameString")
return value.strip() # Usually perfect, with full marketing name
except Exception:
# Fallback: sometimes the key is under a different index, try 1
try:
key = OpenKey(HKEY_LOCAL_MACHINE, r"HARDWARE\DESCRIPTION\System\CentralProcessor\1")
value, _ = QueryValueEx(key, "ProcessorNameString")
return value.strip()
except Exception:
return "Unknown CPU"
elif sys.platform.startswith('linux'):
try:
with open('/proc/cpuinfo') as f:
for line in f:
if line.startswith('model name'):
return line.split(':', 1)[1].strip()
except Exception:
pass
# Ultimate fallback
return platform.processor() or 'Unknown CPU'
def resources_dir() -> str:
return os.path.join(os.path.dirname(__file__), '..', '..', 'resources')
def resources_dir():
resources_directory = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))),
'resources')
return resources_directory
def copy_directory_contents(src_dir: str, dst_dir: str) -> None:
for item in os.listdir(src_dir):
src_path = os.path.join(src_dir, item)
dst_path = os.path.join(dst_dir, item)
if os.path.isdir(src_path):
shutil.copytree(src_path, dst_path)
else:
shutil.copy2(src_path, dst_path)
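One caveat with the per-item loop above: `shutil.copytree(src, dst)` raises `FileExistsError` when a destination directory already exists. On Python 3.8+ the loop can be replaced by a single call with `dirs_exist_ok=True`, which merges into an existing destination. A minimal sketch:

```python
import os
import shutil
import tempfile

src = tempfile.mkdtemp()
dst = tempfile.mkdtemp()  # already exists, which plain copytree would reject
with open(os.path.join(src, 'config.yml'), 'w') as f:
    f.write('port_number: 8080\n')

# Python 3.8+: merge src into the existing dst instead of failing
shutil.copytree(src, dst, dirs_exist_ok=True)
copied = os.listdir(dst)
print(copied)  # ['config.yml']
shutil.rmtree(src)
shutil.rmtree(dst)
```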
def config_dir():
config_directory = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))),
'config')
return config_directory
def check_for_updates(repo_name: str, repo_owner: str, app_name: str, current_version: str) -> Optional[Dict[str, Any]]:
def get_github_releases(owner, repo):
import requests
url = f"https://api.github.com/repos/{owner}/{repo}/releases"
try:
response = requests.get(url, timeout=3)
response.raise_for_status()
releases = response.json()
return releases
except Exception as e:
logger.error(f"Error checking for updates: {e}")
return []
releases = get_github_releases(repo_owner, repo_name)
if not releases:
return None
latest_version = releases[0]
latest_version_tag = latest_version['tag_name']
from packaging import version
if version.parse(latest_version_tag) > version.parse(current_version):
logger.info(f"Newer version of {app_name} available. "
f"Latest: {latest_version_tag}, Current: {current_version}")
return latest_version
return None
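`packaging.version.parse` is used above because plain string comparison mis-orders release tags — lexically `'1.10.0'` sorts before `'1.9.0'`. A dependency-free sketch of the same idea for simple numeric tags (no pre-release or build-metadata handling):

```python
def version_tuple(tag):
    # 'v1.10.0' -> (1, 10, 0); assumes purely numeric dot-separated tags
    return tuple(int(part) for part in tag.lstrip('v').split('.'))

print('1.10.0' > '1.9.0')                                # False (string compare)
print(version_tuple('1.10.0') > version_tuple('1.9.0'))  # True
```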
def is_localhost(comparison_hostname: str) -> bool:
return comparison_hostname in ['localhost', '127.0.0.1', socket.gethostname()]
def num_to_alphanumeric(num: int) -> str:
return string.ascii_letters[num % 26] + str(num // 26)
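`num_to_alphanumeric` cycles through the lowercase letters (indices 0–25 of `ascii_letters`) and appends a wrap counter, so arbitrary integers map to short unique ids. Reproduced standalone for illustration:

```python
import string

def num_to_alphanumeric(num):
    # same logic as the helper above: letter cycles a-z, digit counts the wraps
    return string.ascii_letters[num % 26] + str(num // 26)

print([num_to_alphanumeric(n) for n in (0, 1, 25, 26, 27)])  # ['a0', 'b0', 'z0', 'a1', 'b1']
```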
def get_gpu_info() -> List[Dict[str, Any]]:
"""Cross-platform GPU information retrieval"""
def get_windows_gpu_info():
"""Get GPU info on Windows"""
try:
result = subprocess.run(
['wmic', 'path', 'win32_videocontroller', 'get', 'name,AdapterRAM', '/format:list'],
capture_output=True, text=True, timeout=5
)
# Virtual adapters to exclude
virtual_adapters = [
'virtual', 'rdp', 'hyper-v', 'microsoft basic', 'basic display',
'vga compatible', 'dummy', 'nvfbc', 'nvencode'
]
gpus = []
current_gpu = None
for line in result.stdout.strip().split('\n'):
line = line.strip()
if not line:
continue
if line.startswith('Name='):
if current_gpu and current_gpu.get('name'):
gpus.append(current_gpu)
gpu_name = line.replace('Name=', '').strip()
# Skip virtual adapters
if any(virtual in gpu_name.lower() for virtual in virtual_adapters):
current_gpu = None
else:
current_gpu = {'name': gpu_name, 'memory': 'Integrated'}
elif line.startswith('AdapterRAM=') and current_gpu:
vram_bytes_str = line.replace('AdapterRAM=', '').strip()
if vram_bytes_str and vram_bytes_str != '0':
try:
vram_gb = int(vram_bytes_str) / (1024**3)
current_gpu['memory'] = round(vram_gb, 2)
except ValueError:  # AdapterRAM value was not numeric
pass
if current_gpu and current_gpu.get('name'):
gpus.append(current_gpu)
return gpus if gpus else [{'name': 'Unknown GPU', 'memory': 'Unknown'}]
except Exception as e:
logger.error(f"Failed to get Windows GPU info: {e}")
return [{'name': 'Unknown GPU', 'memory': 'Unknown'}]
def get_macos_gpu_info():
"""Get GPU info on macOS (works with Apple Silicon)"""
try:
if current_system_cpu() == "arm64":
# don't bother with system_profiler on Apple ARM - we know it's integrated
return [{'name': current_system_cpu_brand(), 'memory': 'Integrated'}]
result = subprocess.run(['system_profiler', 'SPDisplaysDataType', '-detailLevel', 'mini', '-json'],
capture_output=True, text=True, timeout=5)
data = json.loads(result.stdout)
gpus = []
displays = data.get('SPDisplaysDataType', [])
for display in displays:
if 'sppci_model' in display:
gpus.append({
'name': display.get('sppci_model', 'Unknown GPU'),
'memory': display.get('sppci_vram', 'Integrated'),
})
return gpus if gpus else [{'name': 'Apple GPU', 'memory': 'Integrated'}]
except Exception as e:
print(f"Failed to get macOS GPU info: {e}")
return [{'name': 'Unknown GPU', 'memory': 'Unknown'}]
def get_linux_gpu_info():
gpus = []
try:
# Run plain lspci and filter for GPU-related lines
output = subprocess.check_output(
["lspci"], universal_newlines=True, stderr=subprocess.DEVNULL
)
for line in output.splitlines():
if any(keyword in line.lower() for keyword in ["vga", "3d", "display"]):
# Extract the part after the colon (vendor + model)
if ":" in line:
name_part = line.split(":", 1)[1].strip()
# Clean up common extras like (rev xx) or (prog-if ...)
name = name_part.split("(")[0].split("controller:")[-1].strip()
vendor = "Unknown"
if "nvidia" in name.lower():
vendor = "NVIDIA"
elif "amd" in name.lower() or "ati" in name.lower():
vendor = "AMD"
elif "intel" in name.lower():
vendor = "Intel"
gpus.append({
"name": name,
"vendor": vendor,
"memory": "Unknown"
})
except FileNotFoundError:
print("lspci not found. Install pciutils: sudo apt install pciutils")
return []
except Exception as e:
print(f"Error running lspci: {e}")
return []
return gpus
system = platform.system()
if system == 'Darwin': # macOS
return get_macos_gpu_info()
elif system == 'Windows':
return get_windows_gpu_info()
else: # Assume Linux or other
return get_linux_gpu_info()
COMMON_RESOLUTIONS = {
# SD
"SD_480p": (640, 480),
"NTSC_DVD": (720, 480),
"PAL_DVD": (720, 576),
# HD
"HD_720p": (1280, 720),
"HD_900p": (1600, 900),
"HD_1080p": (1920, 1080),
# Cinema / Film
"2K_DCI": (2048, 1080),
"4K_DCI": (4096, 2160),
# UHD / Consumer
"UHD_4K": (3840, 2160),
"UHD_5K": (5120, 2880),
"UHD_8K": (7680, 4320),
# Ultrawide / Aspect Variants
"UW_1080p": (2560, 1080),
"UW_1440p": (3440, 1440),
"UW_5K": (5120, 2160),
# Mobile / Social
"VERTICAL_1080x1920": (1080, 1920),
"SQUARE_1080": (1080, 1080),
# Classic / Legacy
"VGA": (640, 480),
"SVGA": (800, 600),
"XGA": (1024, 768),
"WXGA": (1280, 800),
}
COMMON_FRAME_RATES = {
"23.976 (NTSC Film)": 23.976,
"24 (Cinema)": 24.0,
"25 (PAL)": 25.0,
"29.97 (NTSC)": 29.97,
"30": 30.0,
"48 (HFR Film)": 48.0,
"50 (PAL HFR)": 50.0,
"59.94": 59.94,
"60": 60.0,
"72": 72.0,
"90 (VR)": 90.0,
"120": 120.0,
"144 (Gaming)": 144.0,
"240 (HFR)": 240.0,
}
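Tables like `COMMON_FRAME_RATES` are handy for snapping a measured rate to its nearest standard value. A sketch under that assumption (the `nearest_frame_rate` helper is illustrative, and only a subset of the table is reproduced here):

```python
COMMON_FRAME_RATES = {
    "23.976 (NTSC Film)": 23.976,
    "24 (Cinema)": 24.0,
    "25 (PAL)": 25.0,
    "29.97 (NTSC)": 29.97,
    "30": 30.0,
}

def nearest_frame_rate(measured):
    # pick the table entry with the smallest absolute difference
    return min(COMMON_FRAME_RATES.items(), key=lambda kv: abs(kv[1] - measured))

print(nearest_frame_rate(23.9))  # ('23.976 (NTSC Film)', 23.976)
```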
def is_localhost(comparison_hostname):
# socket.gethostname() does not always include '.local', so compare sanitized (lowercased, suffix-stripped) hostnames
try:
comparison_hostname = comparison_hostname.lower().replace('.local', '')
local_hostname = socket.gethostname().lower().replace('.local', '')
return comparison_hostname == local_hostname
except AttributeError:
return False
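The sanitised comparison above boils down to lower-casing and stripping the mDNS `.local` suffix. Pulled out as a standalone helper (hypothetical name) it is easy to test:

```python
def normalize_hostname(hostname):
    # lowercase and drop the mDNS '.local' suffix before comparing
    return hostname.lower().replace('.local', '')

print(normalize_hostname('Render-01.local') == normalize_hostname('render-01'))  # True
```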


@@ -1,200 +1,47 @@
import logging
import os
import zipfile
from concurrent.futures import ThreadPoolExecutor
import subprocess
import threading
import requests
from src.api.server_proxy import RenderServerProxy
from src.utilities.misc_helper import get_file_size_human
from src.utilities.zeroconf_server import ZeroconfServer
from src.utilities.ffmpeg_helper import generate_thumbnail, save_first_frame
logger = logging.getLogger()
def download_missing_frames_from_subjob(local_job, subjob_id, subjob_hostname):
success = True
try:
local_files = [os.path.basename(x) for x in local_job.file_list()]
subjob_proxy = RenderServerProxy(subjob_hostname)
subjob_files = subjob_proxy.get_job_files_list(job_id=subjob_id) or []
def generate_thumbnail_for_job(job, thumb_video_path, thumb_image_path, max_width=320):
for subjob_filename in subjob_files:
if subjob_filename not in local_files:
try:
logger.debug(f"Downloading new file '{subjob_filename}' from {subjob_hostname}")
local_save_path = os.path.join(os.path.dirname(local_job.output_path), subjob_filename)
subjob_proxy.download_job_file(job_id=subjob_id, job_filename=subjob_filename,
save_path=local_save_path)
logger.debug(f'Downloaded successfully - {local_save_path}')
except Exception as e:
logger.error(f"Error downloading file '{subjob_filename}' from {subjob_hostname}: {e}")
success = False
except Exception as e:
logger.exception(f'Uncaught exception while trying to download from subjob: {e}')
success = False
return success
# Simple thread to generate thumbs in background
def generate_thumb_thread(source):
in_progress_path = thumb_video_path + '_IN-PROGRESS'
open(in_progress_path, 'a').close()  # cross-platform equivalent of 'touch'
try:
logger.debug(f"Generating video thumbnail for {source}")
generate_thumbnail(source_path=source, dest_path=thumb_video_path, max_width=max_width)
except subprocess.CalledProcessError as err:
logger.error(f"Error generating video thumbnail for {source}: {err}")
try:
os.remove(in_progress_path)
except FileNotFoundError:
pass
def download_all_from_subjob(local_job, subjob_id, subjob_hostname):
"""
Downloads and extracts files from a completed subjob on a remote server.
# Determine best source file to use for thumbs
source_files = job.file_list() or [job.input_path]
if source_files:
video_formats = ['.mp4', '.mov', '.avi', '.mpg', '.mpeg', '.mxf', '.m4v', '.mkv']
image_formats = ['.jpg', '.png', '.exr']
Parameters:
local_job (BaseRenderWorker): The local parent job worker.
subjob_id (str or int): The ID of the subjob.
subjob_hostname (str): The hostname of the remote server where the subjob is located.
image_files = [f for f in source_files if os.path.splitext(f)[-1].lower() in image_formats]
video_files = [f for f in source_files if os.path.splitext(f)[-1].lower() in video_formats]
Returns:
bool: True if the files have been downloaded and extracted successfully, False otherwise.
"""
child_key = f'{subjob_id}@{subjob_hostname}'
logname = f"{local_job.id}:{child_key}"
zip_file_path = local_job.output_path + f'_{subjob_hostname}_{subjob_id}.zip'
# download zip file from server
try:
local_job.children[child_key]['download_status'] = 'working'
logger.info(f"Downloading completed subjob files from {subjob_hostname} to localhost")
RenderServerProxy(subjob_hostname).download_all_job_files(subjob_id, zip_file_path)
logger.info(f"File transfer complete for {logname} - Transferred {get_file_size_human(zip_file_path)}")
except Exception as e:
logger.error(f"Error downloading files from remote server: {e}")
local_job.children[child_key]['download_status'] = 'failed'
return False
# extract zip
try:
logger.debug(f"Extracting zip file: {zip_file_path}")
extract_path = os.path.dirname(zip_file_path)
with zipfile.ZipFile(zip_file_path, 'r') as zip_ref:
zip_ref.extractall(extract_path)
logger.info(f"Successfully extracted zip to: {extract_path}")
os.remove(zip_file_path)
local_job.children[child_key]['download_status'] = 'complete'
except Exception as e:
logger.exception(f"Exception extracting zip file: {e}")
local_job.children[child_key]['download_status'] = 'failed'
return local_job.children[child_key].get('download_status', None) == 'complete'
def distribute_server_work(start_frame, end_frame, available_servers, method='evenly'):
"""
Splits the frame range among available servers proportionally based on their performance (CPU count).
Args:
start_frame (int): The start frame number of the animation to be rendered.
end_frame (int): The end frame number of the animation to be rendered.
available_servers (list): A list of available server dictionaries. Each server dictionary should include
'hostname' and 'cpu_count' keys (see find_available_servers).
method (str, optional): Specifies the distribution method. Possible values are 'cpu_benchmark', 'cpu_count'
and 'evenly'.
Defaults to 'evenly'.
Returns:
list: A list of server dictionaries where each dictionary includes the frame range and total number of
frames to be rendered by the server.
"""
# Calculate respective frames for each server
def divide_frames_by_cpu_count(frame_start, frame_end, servers):
total_frames = frame_end - frame_start + 1
total_cpus = sum(server['cpu_count'] for server in servers)
frame_ranges = {}
current_frame = frame_start
allocated_frames = 0
for i, server in enumerate(servers):
if i == len(servers) - 1: # if it's the last server
# Give all remaining frames to the last server
num_frames = total_frames - allocated_frames
else:
num_frames = round((server['cpu_count'] / total_cpus) * total_frames)
allocated_frames += num_frames
frame_end_for_server = current_frame + num_frames - 1
if current_frame <= frame_end_for_server:
frame_ranges[server['hostname']] = (current_frame, frame_end_for_server)
current_frame = frame_end_for_server + 1
return frame_ranges
def divide_frames_by_benchmark(frame_start, frame_end, servers):
def fetch_benchmark(server):
if (video_files or image_files) and not os.path.exists(thumb_image_path):
try:
benchmark = requests.get(f'http://{server["hostname"]}:{ZeroconfServer.server_port}'
f'/api/cpu_benchmark').text
server['cpu_benchmark'] = benchmark
logger.debug(f'Benchmark for {server["hostname"]}: {benchmark}')
except requests.exceptions.RequestException as e:
logger.error(f'Error fetching benchmark for {server["hostname"]}: {e}')
path_of_source = image_files[0] if image_files else video_files[0]
logger.debug(f"Generating image thumbnail for {path_of_source}")
save_first_frame(source_path=path_of_source, dest_path=thumb_image_path, max_width=max_width)
except Exception as e:
logger.error(f"Exception saving first frame: {e}")
# Number of threads to use (can adjust based on your needs or number of servers)
threads = len(servers)
with ThreadPoolExecutor(max_workers=threads) as executor:
executor.map(fetch_benchmark, servers)
total_frames = frame_end - frame_start + 1
total_performance = sum(int(server['cpu_benchmark']) for server in servers)
frame_ranges = {}
current_frame = frame_start
allocated_frames = 0
for i, server in enumerate(servers):
if i == len(servers) - 1: # if it's the last server
# Give all remaining frames to the last server
num_frames = total_frames - allocated_frames
else:
num_frames = round((int(server['cpu_benchmark']) / total_performance) * total_frames)
allocated_frames += num_frames
frame_end_for_server = current_frame + num_frames - 1
if current_frame <= frame_end_for_server:
frame_ranges[server['hostname']] = (current_frame, frame_end_for_server)
current_frame = frame_end_for_server + 1
return frame_ranges
def divide_frames_equally(frame_start, frame_end, servers):
frame_range = frame_end - frame_start + 1
frames_per_server = frame_range // len(servers)
leftover_frames = frame_range % len(servers)
frame_ranges = {}
current_start = frame_start
for i, server in enumerate(servers):
current_end = current_start + frames_per_server - 1
if leftover_frames > 0:
current_end += 1
leftover_frames -= 1
if current_start <= current_end:
frame_ranges[server['hostname']] = (current_start, current_end)
current_start = current_end + 1
return frame_ranges
if len(available_servers) == 1:
breakdown = {available_servers[0]['hostname']: (start_frame, end_frame)}
else:
logger.debug(f'Splitting between {len(available_servers)} servers by {method} method')
if method == 'evenly':
breakdown = divide_frames_equally(start_frame, end_frame, available_servers)
elif method == 'cpu_benchmark':
breakdown = divide_frames_by_benchmark(start_frame, end_frame, available_servers)
elif method == 'cpu_count':
breakdown = divide_frames_by_cpu_count(start_frame, end_frame, available_servers)
else:
raise ValueError(f"Invalid distribution method: {method}")
server_breakdown = [server for server in available_servers if breakdown.get(server['hostname']) is not None]
for server in server_breakdown:
server['frame_range'] = breakdown[server['hostname']]
server['total_frames'] = breakdown[server['hostname']][-1] - breakdown[server['hostname']][0] + 1
return server_breakdown
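The 'evenly' branch above hands any remainder out one frame at a time to the first servers. Sketched standalone with plain hostnames in place of server dicts:

```python
def divide_frames_equally(frame_start, frame_end, hostnames):
    total = frame_end - frame_start + 1
    per_server, leftover = divmod(total, len(hostnames))
    frame_ranges = {}
    current = frame_start
    for host in hostnames:
        end = current + per_server - 1
        if leftover > 0:  # hand the extra frames to the first servers
            end += 1
            leftover -= 1
        if current <= end:
            frame_ranges[host] = (current, end)
        current = end + 1
    return frame_ranges

print(divide_frames_equally(1, 10, ['a', 'b', 'c']))  # {'a': (1, 4), 'b': (5, 7), 'c': (8, 10)}
```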
if video_files and not os.path.exists(thumb_video_path):
x = threading.Thread(target=generate_thumb_thread, args=(video_files[0],))
x.start()


@@ -2,8 +2,7 @@ import logging
import socket
from pubsub import pub
from zeroconf import Zeroconf, ServiceInfo, ServiceBrowser, ServiceStateChange, NonUniqueNameException, \
NotRunningException
from zeroconf import Zeroconf, ServiceInfo, ServiceBrowser, ServiceStateChange, NonUniqueNameException
logger = logging.getLogger()
@@ -23,25 +22,17 @@ class ZeroconfServer:
cls.service_type = service_type
cls.server_name = server_name
cls.server_port = server_port
try: # Stop any previously running instances
socket.gethostbyname(socket.gethostname())
except socket.gaierror:
cls.stop()
@classmethod
def start(cls, listen_only=False):
if not cls.service_type:
raise RuntimeError("The 'configure' method must be run before starting the zeroconf server")
elif not listen_only:
logger.debug("Starting zeroconf service")
if not listen_only:
cls._register_service()
else:
logger.debug("Starting zeroconf service - Listen only mode")
cls._browse_services()
@classmethod
def stop(cls):
logger.debug("Stopping zeroconf service")
cls._unregister_service()
cls.zeroconf.close()
@@ -61,7 +52,7 @@ class ZeroconfServer:
cls.service_info = info
cls.zeroconf.register_service(info)
logger.info(f"Registered zeroconf service: {cls.service_info.name}")
except (NonUniqueNameException, socket.gaierror) as e:
except NonUniqueNameException as e:
logger.error(f"Error establishing zeroconf: {e}")
@classmethod
@@ -78,49 +69,40 @@ class ZeroconfServer:
    @classmethod
    def _on_service_discovered(cls, zeroconf, service_type, name, state_change):
        try:
            info = zeroconf.get_service_info(service_type, name)
            hostname = name.split(f'.{cls.service_type}')[0]
            logger.debug(f"Zeroconf: {hostname} {state_change}")
            if service_type == cls.service_type:
                if state_change == ServiceStateChange.Added or state_change == ServiceStateChange.Updated:
                    cls.client_cache[hostname] = info
                else:
                    cls.client_cache.pop(hostname)
                pub.sendMessage('zeroconf_state_change', hostname=hostname, state_change=state_change)
        except NotRunningException:
            pass
        info = zeroconf.get_service_info(service_type, name)
        logger.debug(f"Zeroconf: {name} {state_change}")
        if service_type == cls.service_type:
            if state_change == ServiceStateChange.Added or state_change == ServiceStateChange.Updated:
                cls.client_cache[name] = info
            else:
                cls.client_cache.pop(name)
            pub.sendMessage('zeroconf_state_change', hostname=name, state_change=state_change, info=info)
    @classmethod
    def found_hostnames(cls):
        fetched_hostnames = [x.split(f'.{cls.service_type}')[0] for x in cls.client_cache.keys()]
        local_hostname = socket.gethostname()

        # Define a sort key function
        def sort_key(hostname):
            # Return False if it's the local hostname so it comes first, else True
            return False if hostname == local_hostname else True

        # Sort the list with the local hostname first
        sorted_hostnames = sorted(cls.client_cache.keys(), key=sort_key)
        sorted_hostnames = sorted(fetched_hostnames, key=sort_key)
        return sorted_hostnames
    @classmethod
    def get_hostname_properties(cls, hostname):
        server_info = cls.client_cache.get(hostname).properties
        new_key = hostname + '.' + cls.service_type
        server_info = cls.client_cache.get(new_key).properties
        decoded_server_info = {key.decode('utf-8'): value.decode('utf-8') for key, value in server_info.items()}
        return decoded_server_info
# Example usage:
if __name__ == "__main__":
    import time

    logging.basicConfig(level=logging.DEBUG)
    ZeroconfServer.configure("_zordon._tcp.local.", "foobar.local", 8080)
    try:
        ZeroconfServer.start()
        while True:
            time.sleep(0.1)
    except KeyboardInterrupt:
        pass
        input("Server running - Press enter to end")
    finally:
        ZeroconfServer.stop()
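The local-first ordering in `found_hostnames` relies on booleans sorting `False` before `True`; a minimal standalone sketch of the same idea (hostnames are illustrative):

```python
def sort_local_first(hostnames, local_hostname):
    # False sorts before True, so the local host lands first;
    # Python's sort is stable, so the rest keep their original order.
    return sorted(hostnames, key=lambda h: h != local_hostname)

print(sort_local_first(['render01', 'workstation', 'render02'], 'workstation'))
# → ['workstation', 'render01', 'render02']
```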


@@ -1,8 +0,0 @@
APP_NAME = "Zordon"
APP_VERSION = "0.0.1"
APP_AUTHOR = "Brett Williams"
APP_DESCRIPTION = "Distributed Render Farm Tools"
APP_COPYRIGHT_YEAR = "2024"
APP_LICENSE = "MIT License"
APP_REPO_NAME = APP_NAME
APP_REPO_OWNER = "blw1138"


@@ -0,0 +1,64 @@
const grid = new gridjs.Grid({
columns: [
{ data: (row) => row.id,
name: 'Thumbnail',
formatter: (cell) => gridjs.html(`<img src="/api/job/${cell}/thumbnail?video_ok" style='width: 200px; min-width: 120px;'>`),
sort: {enabled: false}
},
{ id: 'name',
name: 'Name',
data: (row) => row.name,
formatter: (name, row) => gridjs.html(`<a href="/ui/job/${row.cells[0].data}/full_details">${name}</a>`)
},
{ id: 'renderer', data: (row) => `${row.renderer}-${row.renderer_version}`, name: 'Renderer' },
{ id: 'priority', name: 'Priority' },
{ id: 'status',
name: 'Status',
data: (row) => row,
formatter: (cell, row) => gridjs.html(`
<span class="tag ${(cell.status == 'running') ? 'is-hidden' : ''} ${(cell.status == 'cancelled') ?
'is-warning' : (cell.status == 'error') ? 'is-danger' : (cell.status == 'not_started') ?
'is-light' : 'is-primary'}">${cell.status}</span>
<progress class="progress is-primary ${(cell.status != 'running') ? 'is-hidden': ''}"
value="${(parseFloat(cell.percent_complete) * 100.0)}" max="100">${cell.status}</progress>
`)},
{ id: 'time_elapsed', name: 'Time Elapsed' },
{ data: (row) => row.total_frames ?? 'N/A', name: 'Frame Count' },
{ id: 'client', name: 'Client'},
{ data: (row) => row.last_output ?? 'N/A',
name: 'Last Output',
formatter: (output, row) => gridjs.html(`<a href="/api/job/${row.cells[0].data}/logs">${output}</a>`)
},
{ data: (row) => row,
name: 'Commands',
formatter: (cell, row) => gridjs.html(`
<div class="field has-addons" style='white-space: nowrap; display: inline-block;'>
<button class="button is-info" onclick="window.location.href='/ui/job/${row.cells[0].data}/full_details';">
<span class="icon"><i class="fa-solid fa-info"></i></span>
</button>
<button class="button is-link" onclick="window.location.href='/api/job/${row.cells[0].data}/logs';">
<span class="icon"><i class="fa-regular fa-file-lines"></i></span>
</button>
<button class="button is-warning is-active ${(cell.status != 'running') ? 'is-hidden': ''}" onclick="window.location.href='/api/job/${row.cells[0].data}/cancel?confirm=True&redirect=True';">
<span class="icon"><i class="fa-solid fa-x"></i></span>
</button>
<button class="button is-success ${(cell.status != 'completed') ? 'is-hidden': ''}" onclick="window.location.href='/api/job/${row.cells[0].data}/download_all';">
<span class="icon"><i class="fa-solid fa-download"></i></span>
<span>${cell.file_count}</span>
</button>
<button class="button is-danger" onclick="window.location.href='/api/job/${row.cells[0].data}/delete?confirm=True&redirect=True'">
<span class="icon"><i class="fa-regular fa-trash-can"></i></span>
</button>
</div>
`),
sort: false
},
{ id: 'owner', name: 'Owner' }
],
autoWidth: true,
server: {
url: '/api/jobs',
then: results => results['jobs'],
},
sort: true,
}).render(document.getElementById('table'));
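The grid fetches rows from `/api/jobs` and unwraps the `jobs` key, so the endpoint must return a payload shaped roughly like this (field names taken from the column definitions above; the values are illustrative, not real API output):

```python
import json

payload = {
    "jobs": [{
        "id": "a1b2c3",
        "name": "intro_scene.blend",
        "renderer": "blender",
        "renderer_version": "3.6",
        "priority": 2,
        "status": "running",
        "percent_complete": "0.42",   # parsed client-side with parseFloat
        "time_elapsed": "00:03:17",
        "total_frames": 240,          # a missing value renders as 'N/A'
        "client": "render01",
        "last_output": "Fra:101 Mem:512M",
        "file_count": 0,
        "owner": "brett",
    }]
}

# What `then: results => results['jobs']` hands to the grid:
rows = json.loads(json.dumps(payload))["jobs"]
print(rows[0]["status"])  # → running
```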


@@ -0,0 +1,44 @@
document.addEventListener('DOMContentLoaded', () => {
// Functions to open and close a modal
function openModal($el) {
$el.classList.add('is-active');
}
function closeModal($el) {
$el.classList.remove('is-active');
}
function closeAllModals() {
(document.querySelectorAll('.modal') || []).forEach(($modal) => {
closeModal($modal);
});
}
// Add a click event on buttons to open a specific modal
(document.querySelectorAll('.js-modal-trigger') || []).forEach(($trigger) => {
const modal = $trigger.dataset.target;
const $target = document.getElementById(modal);
$trigger.addEventListener('click', () => {
openModal($target);
});
});
// Add a click event on various child elements to close the parent modal
(document.querySelectorAll('.modal-background, .modal-close, .modal-card-head .delete, .modal-card-foot .button') || []).forEach(($close) => {
const $target = $close.closest('.modal');
$close.addEventListener('click', () => {
closeModal($target);
});
});
// Add a keyboard event to close all modals
document.addEventListener('keydown', (event) => {
const e = event || window.event;
if (e.keyCode === 27) { // Escape key
closeAllModals();
}
});
});


@@ -0,0 +1,48 @@
{% extends 'layout.html' %}
{% block body %}
<div class="container" style="text-align:center; width: 100%">
<br>
{% if media_url %}
<video width="1280" height="720" controls>
<source src="{{media_url}}" type="video/mp4">
Your browser does not support the video tag.
</video>
{% elif job_status == 'Running' %}
<div style="width: 100%; height: 720px; position: relative; background: black; text-align: center; color: white;">
<img src="/static/images/gears.png" style="vertical-align: middle; width: auto; height: auto; position:absolute; margin: auto; top: 0; bottom: 0; left: 0; right: 0;">
<span style="height: auto; position:absolute; margin: auto; top: 58%; left: 0; right: 0; color: white; width: 60%">
<progress class="progress is-primary" value="{{job.worker_data()['percent_complete'] * 100}}" max="100" style="margin-top: 6px;" id="progress-bar">Rendering</progress>
Rendering in Progress - <span id="percent-complete">{{(job.worker_data()['percent_complete'] * 100) | int}}%</span>
<br>Time Elapsed: <span id="time-elapsed">{{job.worker_data()['time_elapsed']}}</span>
</span>
<script>
var startingStatus = '{{job.status.value}}';
function update_job() {
$.getJSON('/api/job/{{job.id}}', function(data) {
document.getElementById('progress-bar').value = (data.percent_complete * 100);
document.getElementById('percent-complete').innerHTML = (data.percent_complete * 100).toFixed(0) + '%';
document.getElementById('time-elapsed').innerHTML = data.time_elapsed;
if (data.status != startingStatus){
clearInterval(renderingTimer);
window.location.reload(true);
};
});
}
if (startingStatus == 'running'){
var renderingTimer = setInterval(update_job, 1000);
};
</script>
</div>
{% else %}
<div style="width: 100%; height: 720px; position: relative; background: black;">
<img src="/static/images/{{job_status}}.png" style="vertical-align: middle; width: auto; height: auto; position:absolute; margin: auto; top: 0; bottom: 0; left: 0; right: 0;">
<span style="height: auto; position:absolute; margin: auto; top: 58%; left: 0; right: 0; color: white;">
{{job_status}}
</span>
</div>
{% endif %}
<br>
{{detail_table|safe}}
</div>
{% endblock %}


@@ -0,0 +1,8 @@
{% extends 'layout.html' %}
{% block body %}
<div class="container is-fluid" style="padding-top: 20px;">
<div id="table" class="table"></div>
</div>
<script src="/static/js/job_table.js"></script>
{% endblock %}


@@ -0,0 +1,236 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>Zordon Dashboard</title>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bulma@0.9.4/css/bulma.min.css">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/4.7.0/css/font-awesome.min.css">
<script src="https://cdn.jsdelivr.net/npm/jquery/dist/jquery.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/gridjs/dist/gridjs.umd.js"></script>
<link href="https://unpkg.com/gridjs/dist/theme/mermaid.min.css" rel="stylesheet" />
<script src="https://kit.fontawesome.com/698705d14d.js" crossorigin="anonymous"></script>
<script type="text/javascript" src="/static/js/modals.js"></script>
</head>
<body onload="rendererChanged(document.getElementById('renderer'))">
<nav class="navbar is-dark" role="navigation" aria-label="main navigation">
<div class="navbar-brand">
<a class="navbar-item" href="/">
<img src="/static/images/logo.png">
</a>
</div>
<div id="navbarBasicExample" class="navbar-menu">
<div class="navbar-start">
<a class="navbar-item" href="/">
Home
</a>
</div>
<div class="navbar-end">
<div class="navbar-item">
<button class="button is-primary js-modal-trigger" data-target="add-job-modal">
<span class="icon">
<i class="fa-solid fa-upload"></i>
</span>
<span>Submit Job</span>
</button>
</div>
</div>
</div>
</nav>
{% block body %}
{% endblock %}
<div id="add-job-modal" class="modal">
<!-- Start Add Form -->
<form id="submit_job" action="/api/add_job?redirect=True" method="POST" enctype="multipart/form-data">
<div class="modal-background"></div>
<div class="modal-card">
<header class="modal-card-head">
<p class="modal-card-title">Submit New Job</p>
<button class="delete" aria-label="close" type="button"></button>
</header>
<section class="modal-card-body">
<!-- File Uploader -->
<label class="label">Upload File</label>
<div id="file-uploader" class="file has-name is-fullwidth">
<label class="file-label">
<input class="file-input is-small" type="file" name="file">
<span class="file-cta">
<span class="file-icon">
<i class="fas fa-upload"></i>
</span>
<span class="file-label">
Choose a file…
</span>
</span>
<span class="file-name">
No File Uploaded
</span>
</label>
</div>
<br>
<script>
const fileInput = document.querySelector('#file-uploader input[type=file]');
fileInput.onchange = () => {
if (fileInput.files.length > 0) {
const fileName = document.querySelector('#file-uploader .file-name');
fileName.textContent = fileInput.files[0].name;
}
}
const presets = {
{% for preset in preset_list %}
{{preset}}: {
name: '{{preset_list[preset]['name']}}',
renderer: '{{preset_list[preset]['renderer']}}',
args: '{{preset_list[preset]['args']}}',
},
{% endfor %}
};
function rendererChanged(ddl1) {
var renderers = {
{% for renderer in renderer_info %}
{% if renderer_info[renderer]['supported_export_formats'] %}
{{renderer}}: [
{% for format in renderer_info[renderer]['supported_export_formats'] %}
'{{format}}',
{% endfor %}
],
{% endif %}
{% endfor %}
};
var selectedRenderer = ddl1.value;
var ddl3 = document.getElementById('preset_list');
ddl3.options.length = 0;
createOption(ddl3, '-Presets-', '');
for (var preset_name in presets) {
if (presets[preset_name]['renderer'] == selectedRenderer) {
createOption(ddl3, presets[preset_name]['name'], preset_name);
};
};
document.getElementById('raw_args').value = "";
var ddl2 = document.getElementById('export_format');
ddl2.options.length = 0;
var options = renderers[selectedRenderer];
for (i = 0; i < options.length; i++) {
createOption(ddl2, options[i], options[i]);
};
}
function createOption(ddl, text, value) {
var opt = document.createElement('option');
opt.value = value;
opt.text = text;
ddl.options.add(opt);
}
function addPresetTextToInput(presetfield, textfield) {
var p = presets[presetfield.value];
textfield.value = p['args'];
}
</script>
<!-- Renderer & Priority -->
<div class="field is-grouped">
<p class="control">
<label class="label">Renderer</label>
<span class="select">
<select id="renderer" name="renderer" onchange="rendererChanged(this)">
{% for renderer in renderer_info %}
<option name="renderer" value="{{renderer}}">{{renderer}}</option>
{% endfor %}
</select>
</span>
</p>
<p class="control">
<label class="label">Client</label>
<span class="select">
<select name="client">
<option name="client" value="">First Available</option>
{% for client in render_clients %}
<option name="client" value="{{client}}">{{client}}</option>
{% endfor %}
</select>
</span>
</p>
<p class="control">
<label class="label">Priority</label>
<span class="select">
<select name="priority">
<option name="priority" value="1">1</option>
<option name="priority" value="2" selected="selected">2</option>
<option name="priority" value="3">3</option>
</select>
</span>
</p>
</div>
<!-- Output Path -->
<label class="label">Output</label>
<div class="field has-addons">
<div class="control is-expanded">
<input class="input is-small" type="text" placeholder="Output Name" name="output_path" value="output.mp4">
</div>
<p class="control">
<span class="select is-small">
<select id="export_format" name="export_format">
<option value="ar">option</option>
</select>
</span>
</p>
</div>
<!-- Resolution -->
<!-- <label class="label">Resolution</label>-->
<!-- <div class="field is-grouped">-->
<!-- <p class="control">-->
<!-- <input class="input" type="text" placeholder="auto" maxlength="5" size="8" name="AnyRenderer-arg_x_resolution">-->
<!-- </p>-->
<!-- <label class="label"> x </label>-->
<!-- <p class="control">-->
<!-- <input class="input" type="text" placeholder="auto" maxlength="5" size="8" name="AnyRenderer-arg_y_resolution">-->
<!-- </p>-->
<!-- <label class="label"> @ </label>-->
<!-- <p class="control">-->
<!-- <input class="input" type="text" placeholder="auto" maxlength="3" size="5" name="AnyRenderer-arg_frame_rate">-->
<!-- </p>-->
<!-- <label class="label"> fps </label>-->
<!-- </div>-->
<label class="label">Command Line Arguments</label>
<div class="field has-addons">
<p class="control">
<span class="select is-small">
<select id="preset_list" onchange="addPresetTextToInput(this, document.getElementById('raw_args'))">
<option value="preset-placeholder">presets</option>
</select>
</span>
</p>
<p class="control is-expanded">
<input class="input is-small" type="text" placeholder="Args" id="raw_args" name="raw_args">
</p>
</div>
<!-- End Add Form -->
</section>
<footer class="modal-card-foot">
<input class="button is-link" type="submit"/>
<button class="button" type="button">Cancel</button>
</footer>
</div>
</form>
</div>
</body>
</html>


@@ -0,0 +1,62 @@
<html>
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
<script>
$(function() {
$('#renderer').change(function() {
$('.render_settings').hide();
$('#' + $(this).val()).show();
});
});
</script>
<body>
<h3>Upload a file</h3>
<div>
<form action="/add_job" method="POST"
enctype="multipart/form-data">
<div>
<input type="file" name="file"/><br>
</div>
<input type="hidden" id="origin" name="origin" value="html">
<div id="client">
Render Client:
<select name="client">
{% for client in render_clients %}
<option value="{{client}}">{{client}}</option>
{% endfor %}
</select>
</div>
<div id="priority">
Priority:
<select name="priority">
<option value="1">1</option>
<option value="2" selected>2</option>
<option value="3">3</option>
</select>
</div>
<div>
<label for="renderer">Renderer:</label>
<select id="renderer" name="renderer">
{% for renderer in supported_renderers %}
<option value="{{renderer}}">{{renderer}}</option>
{% endfor %}
</select>
</div>
<div id="blender" class="render_settings" style="display:none">
Engine:
<select name="blender+engine">
<option value="CYCLES">Cycles</option>
<option value="BLENDER_EEVEE">Eevee</option>
</select>
</div>
<br>
<input type="submit"/>
</form>
</div>
</body>
</html>