73 Commits

Author SHA1 Message Date
Brett Williams 4da03e30a2 Update label 2025-03-01 09:41:56 -06:00
Brett Williams 4a566ec7c3 Network password settings WIP 2025-03-01 02:24:20 -06:00
Brett Williams 085d39fde8 Fix issue where icons were not loading 2025-03-01 01:38:22 -06:00
Brett Williams d5f1224c33 Improvements to Launch and Delete buttons 2025-03-01 01:19:58 -06:00
Brett Williams e97e3d74c8 Add ability to ignore system builds 2025-03-01 00:37:11 -06:00
Brett Williams 1af4169447 More WIP on Settings 2025-02-28 23:18:53 -06:00
Brett Williams ea728f7809 Added Local Files section to Settings 2025-02-28 22:48:37 -06:00
Brett Williams a4e6fca73d More WIP for the Settings panel 2025-02-28 22:18:57 -06:00
Brett Williams 9aafb5c0fb Initial commit for settings window 2025-02-28 18:35:32 -06:00
brett 2548280dcc Add check for available software updates (#118)
* Add feature to check github repo for available updates

* Add Check for Updates to Help menu
2024-08-24 12:12:30 -05:00
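A hedged sketch of the update check #118 describes: ask the GitHub releases API for the newest tag and compare it against the running version. Function names and the version-string format are assumptions, not the project's actual API.

```python
# Hypothetical update check against the GitHub releases API (stdlib only).
import json
import urllib.request

def latest_release_tag(repo="blw1138/Zordon"):
    """Fetch the newest release tag for `repo` via the GitHub REST API."""
    url = f"https://api.github.com/repos/{repo}/releases/latest"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)["tag_name"]

def is_newer(remote_tag, local_version):
    """Compare dotted version strings, tolerating a leading 'v'."""
    parse = lambda v: tuple(int(p) for p in v.lstrip("v").split("."))
    return parse(remote_tag) > parse(local_version)
```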
Brett Williams 98ab837057 Fix issue where API server could fail to start 2024-08-24 03:00:57 -05:00
Brett Williams 3fda87935e Only prevent launch if we find unrelated processes 2024-08-24 02:22:38 -05:00
Brett Williams e35a5a689c Make sure only one instance is running at a time 2024-08-24 01:35:50 -05:00
brett dea7574888 Rename create_executables.yml to create-executables.yml 2024-08-23 19:52:41 -05:00
brett a19db9fcf7 Fix issue with create_executables.yml 2024-08-23 19:51:56 -05:00
brett 80b0adb2ad Create executables for all platforms, not just Windows 2024-08-23 19:46:35 -05:00
brett 18873cec6f Only generate Windows binaries when releases are created 2024-08-23 19:37:34 -05:00
brett af6d6e1525 Document all the things! (#117)
Add lots of docstrings everywhere
2024-08-23 19:26:05 -05:00
brett 8bbf19cb30 Fix accidental readme rename 2024-08-23 19:17:03 -05:00
brett 6bdb488ce1 Add "Create Executable" GitHub action for Windows (#116) 2024-08-23 18:36:14 -05:00
brett e792698480 Merge pull request #114
* Better exception handling / error reporting for add job screen

* Don't suppress exceptions for potentially long running functions in bl…

* Increase Blender pack_project_file timeout to 120s
2024-08-20 15:20:24 -05:00
Brett Williams 751d74ced3 Fix issue where Stop Job button would never show 2024-08-15 23:20:04 -05:00
brett e8a4692e0f Update README.md 2024-08-15 15:10:37 -05:00
brett 49ae5a55d9 Add About window and basic commands to MenuBar (#113)
* Initial commit for about_window.py

* Add some basic actions to the MenuBar

* Fix keyboard shortcuts

* Fix path to icon for Windows
2024-08-15 14:27:29 -05:00
Brett Williams d04d14446b Update main.spec to include version numbers on Windows 2024-08-15 11:41:36 -05:00
brett 81e79a1996 Prevent subprocesses from constantly opening windows on Windows (#109)
* Add subprocess.CREATE_NO_WINDOW to blender_engine.py

* Convert ffmpeg_engine.py to use CREATE_NO_WINDOW

* Cleanup Blender implementation

* Cleanup subprocesses in base_worker.py

* Cleanup subprocesses in base_engine.py

* Fix main.spec for Windows (optimize=2 broke it)
2024-08-13 22:16:03 -05:00
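The pattern #109 applies, in isolation: pass `CREATE_NO_WINDOW` on Windows so engine subprocesses don't flash console windows; other platforms need no flag. The helper name is illustrative.

```python
# Suppress console windows for subprocesses on Windows; no-op elsewhere.
import subprocess
import sys

def quiet_popen_flags():
    """creationflags that hide console windows on Windows, 0 elsewhere."""
    if sys.platform == "win32":
        return subprocess.CREATE_NO_WINDOW
    return 0

proc = subprocess.Popen(
    [sys.executable, "-c", "print('render step done')"],
    stdout=subprocess.PIPE,
    creationflags=quiet_popen_flags(),
)
out, _ = proc.communicate()
```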
Brett Williams d30978bef0 Update main.spec for Windows support 2024-08-13 18:02:40 -05:00
brett e2333c4451 Fix processes not ending when stopped (#98)
* Fix processes not ending when stopped

* Fix error when removing a job

* Better error handling

* Refactored killprocess code and fixed windows support

* Improved error handling

* Add try to code that deletes project files

* Wait for the thread to finish after killing the process

* Don't try to stop process multiple times

* Misc cleanup
2024-08-13 11:16:31 -05:00
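A sketch of the fix #98 describes: terminate the engine's whole process tree, then wait, escalating to `kill()` for stragglers. It uses psutil (already a project dependency); the function name is illustrative.

```python
# Kill a process and all of its children, politely first, forcibly second.
import subprocess
import sys

import psutil

def kill_process_tree(pid, timeout=5):
    """Terminate a process tree; force-kill anything still alive after timeout."""
    try:
        parent = psutil.Process(pid)
    except psutil.NoSuchProcess:
        return
    procs = parent.children(recursive=True) + [parent]
    for proc in procs:
        try:
            proc.terminate()  # polite SIGTERM first
        except psutil.NoSuchProcess:
            pass
    _, alive = psutil.wait_procs(procs, timeout=timeout)
    for proc in alive:
        proc.kill()  # anything that ignored terminate() gets SIGKILL

# Demo: start a long-sleeping child, then take it down
child = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(60)"])
kill_process_tree(child.pid)
child.wait(timeout=10)
```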
brett 94a40c46dc Add output file count validation (#97)
* Worker file_list ignores hidden files

* Add frame-count validation logic
2024-08-11 13:00:54 -05:00
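The #97 validation, sketched: count rendered frames while skipping hidden files, then check the count against the job's frame range. Names are illustrative, not the worker's real methods.

```python
# Frame-count validation that ignores hidden files such as .DS_Store.
import os

def visible_files(names):
    """Drop hidden files (dotfiles) from a list of output file names."""
    return [n for n in names if not os.path.basename(n).startswith(".")]

def frames_complete(names, start_frame, end_frame):
    """True when visible output files cover every frame in the range."""
    expected = end_frame - start_frame + 1
    return len(visible_files(names)) >= expected
```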
Brett Williams 8104bd4e86 Add logic to download button in main window 2024-08-11 11:59:40 -05:00
Brett Williams 33adcac592 Code refactoring 2024-08-11 11:54:15 -05:00
Brett Williams d38e10ae9f Add tga, bmp, webp and webm to PreviewManager support 2024-08-10 21:23:06 -05:00
Brett Williams 19b01446ea Make renderer_info threaded again 2024-08-10 21:20:47 -05:00
brett e757506787 Parent creates local subjobs instead of truncating original (#95)
* Parent worker now creates subjob on local host and waits for it

* Improve wait_for_subjobs logic

* Fix setting end_time for base_worker

* API cleanup

* Code refactoring

* Cleanup
2024-08-10 21:19:01 -05:00
brett f9b51886ab Bugfix: Filter out corrupt engines by default (#94)
* Add main.spec

* Fix issue where fetching supported extensions would crash with no default installation

* Engines return version as 'error' if cannot determine version

* EngineManager will now filter out corrupted engine installs by default
2024-08-10 15:00:47 -05:00
brett 3b33649f2d Pyinstaller support (#93)
* Add main.spec

* Fix issue where fetching supported extensions would crash with no default installation
2024-08-10 14:58:41 -05:00
brett 51a5a63944 Use pubsub messages instead of a background thread to process changes (#92)
* Use pubsub messages instead of a background thread to process changes to the RenderQueue

* Misc logging improvements
2024-08-08 23:01:26 -05:00
brett 3600eeb21b Refactor: Move all initialization logic out of api_server and into init (#91)
* Zeroconf logging improvements

* Ignore RuntimeErrors in background threads - Prevents issues during shutdown

* Migrate start up code from api_server.py to init.py

* Add error handlers to the API server to handle detached instances

* Integrate RenderQueue eval loop into RenderQueue object

* Silently catch RuntimeErrors on evaluate_queue

* Stop background queue updates in prepare_for_shutdown
2024-08-08 04:47:22 -05:00
brett 6afb6e65a6 Integrate watchdog into render worker (#88)
* Add a watchdog to base_worker

* Logging cleanup

* Prevent multiple watchdogs from running if render process restarts

* Add process timeout parameter to Config

* Refactor

* Add error handling to process output parsing

* Fix issue where start_time was not getting set consistently
2024-08-06 10:48:24 -05:00
Brett Williams 90d5e9b7af Misc logging cleanup 2024-08-05 10:57:56 -05:00
brett 4df41a2079 Download frames from subjobs as frames are completed (#87)
* Add a frame complete notification to BaseWorker and distributed_job_manager.py

* Add API to download individual files to API server and ServerProxy

* Rename subjob notification API and add download_missing_frames_from_subjob

* Subjobs will now notify parent when a frame is complete

* Fix missed rename

* Add some misc logging

* Better error handling

* Fix frame download file path issue

* Download missing frames at job completion and misc cleanup

* Misc cleanup

* Code cleanup
2024-08-04 21:30:10 -05:00
brett 1cdb7810bf New PreviewManager to handle generating previews asynchronously (#86)
* Add PreviewManager

* Refactoring and better error handling

* Integrate PreviewManager into api_server.py

* Integrate PreviewManager into distributed_job_manager.py

* Add method to preview_manager.py to delete previews and integrate it into api_server

* Misc logging improvements

* Misc code cleanup

* Replace existing preview on job completion - Minor code fixes
2024-08-04 16:45:46 -05:00
Brett Williams 21011e47ca Fix issue where tests would never complete correctly 2024-08-04 11:48:36 -05:00
Brett Williams 86977b9d6d Fix issue where custom job name was being ignored 2024-08-04 11:47:56 -05:00
Brett Williams 220b3fcc25 Streamline job runtime - improve logging 2024-08-03 20:55:22 -05:00
brett 82613c3963 Persist args in db and return args in job json (#82) 2024-08-03 18:42:21 -05:00
Brett Williams abc9724f01 Quickfix: Forgot to commit one rename 2024-08-03 18:28:33 -05:00
brett ef4fc0e42e Blender GPU / CPU Render (#81)
* Add script to get GPU information from Blender

* Change run_python_script to allow it to run without a project file

* Simplify run_python_script code

* Fix mistake

* Add system_info to engine classes and api_server. /api/renderer_info now supports standard and full response modes.

* Get full renderer_info response for add job UI

* Enable setting specific Blender render_device using args

* Add Blender render device options to UI
2024-08-03 18:26:56 -05:00
Brett Williams 9bc490acae Misc cleanup / renaming 2024-08-03 14:04:17 -05:00
brett 21de69ca4f Improve performance on several API calls (#80)
* Streamline fetching renderer_info from API - use threading for performance improvements

* Use concurrent.futures instead of Threading

* Fix timeout issue with server proxy

* Minor fixes to code that handles proxy server online / offline status
2024-08-03 11:02:40 -05:00
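The #80 idea in miniature: fan per-host renderer_info requests out to a thread pool instead of calling hosts one at a time. `fetch_info` is a stand-in for the real HTTP call, and the hostnames are made up.

```python
# Parallel fan-out of per-host API calls with concurrent.futures.
from concurrent.futures import ThreadPoolExecutor

def fetch_info(host):
    # Placeholder for something like GET http://<host>/api/renderer_info
    return {"host": host, "renderers": ["blender", "ffmpeg"]}

def fetch_all(hosts, max_workers=8):
    """Query every host in parallel; map() preserves the input order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch_info, hosts))

results = fetch_all(["render1.local", "render2.local"])
```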
Brett Williams 47770c4fdd Update blender worker to get current frame from filepath output 2024-07-30 20:00:07 -05:00
brett 8a3e74660c Create subjobs after submission - #54 (#79)
* Force start in render queue only starts NOT_STARTED and SCHEDULED jobs

* Refactor adding jobs / subjobs

* Remove dead code

* Fixed issue with bulk job submission

* Cancel job now cancels all subjobs

* Misc fixes

* JSON now returns job hostname

* Add hostname as optional column in DB

* Misc fixes

* Error handling for removing zip file after download

* Clean up imports

* Fixed issue where worker child information would not be saved
2024-07-30 19:22:38 -05:00
Brett Williams 6d33f262b3 Better error handling when posting a new job 2024-07-29 14:50:14 -05:00
brett a0729d71f1 Add long_polling_jobs to API (#78) 2024-02-13 13:11:56 -06:00
brett ecf836c235 Zeroconf offline-handling improvements (#77)
* Add benchmark.py

* Add cpu / disk benchmark APIs

* Add cpu_benchmark method to distributed_job_manager.py

* Do a better job of storing hostnames =

* Remove hostname from Zeroconf cache if server goes offline

* Add cpu / disk benchmark APIs

* Add cpu_benchmark method to distributed_job_manager.py

* Do a better job of storing hostnames =

* Remove hostname from Zeroconf cache if server goes offline

* Wrap main code in try finally block to always stop zeroconf

* Add missing import
2024-02-12 14:57:00 -06:00
brett a31fe98964 Cpu benchmarks #48 (#76)
* Add benchmark.py

* Add cpu / disk benchmark APIs

* Add cpu_benchmark method to distributed_job_manager.py

* Make sure cpu_benchmark is an int

* Improve distributed_job_manager test
2024-02-11 05:19:24 -06:00
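A minimal CPU benchmark in the spirit of #76: time a fixed chunk of work and report a higher-is-faster integer score (the PR notes the score must be an int). The workload and scoring formula are illustrative.

```python
# Toy CPU benchmark: iterations of integer work per millisecond, as an int.
import time

def cpu_benchmark(iterations=200_000):
    """Return an integer score; bigger means faster."""
    start = time.perf_counter()
    total = 0
    for i in range(iterations):
        total += i * i  # busy integer work
    elapsed = time.perf_counter() - start
    return int(iterations / elapsed / 1000)

score = cpu_benchmark()
```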
Brett Williams 79db960383 Add MIT license 2024-01-28 20:54:12 -06:00
brett 85785d9167 Check engine permissions and chmod it to executable if not already (#75) 2024-01-28 10:53:14 -06:00
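What #75 describes, sketched: if an engine binary isn't executable, add the execute bits before launching it. Demonstrated on a throwaway temp file; the path handling is illustrative.

```python
# chmod a file to executable only when it isn't already.
import os
import stat
import tempfile

def ensure_executable(path):
    """Add execute permissions if the file can't currently be executed."""
    if not os.access(path, os.X_OK):
        mode = os.stat(path).st_mode
        os.chmod(path, mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)
    return os.access(path, os.X_OK)

# Demo on a throwaway file that starts without execute bits
fd, demo_path = tempfile.mkstemp()
os.close(fd)
os.chmod(demo_path, 0o644)
made_executable = ensure_executable(demo_path)
os.remove(demo_path)
```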
brett 9757ba9276 Pylint cleanup (#74)
* Misc fixes

* Misc cleanup

* Add all_versions to blender_downloader.py

* More cleanup

* Fix issue with status not reporting engine info

* Misc fixes

* Misc cleanup

* Add all_versions to blender_downloader.py

* More cleanup

* Fix issue with status not reporting engine info
2024-01-28 10:30:57 -06:00
brett d673d7d4bf Misc cleanup (#73)
* Stop previously running zeroconf instances

* Lots of formatting fixes

* Use f-strings for time delta

* More line fixes

* Update requirements.txt

* More misc cleanup

* Simplify README.md
2024-01-27 22:56:33 -06:00
Brett Williams d216ae822e Merge remote-tracking branch 'origin/master' 2023-12-25 17:47:03 -06:00
Brett Williams dabe46bdda Add .pylintrc 2023-12-25 17:46:45 -06:00
brett 2c82c65305 Update pylint.yml
Update python versions
2023-12-25 17:40:55 -06:00
Brett Williams 4004ad893b Update .gitignore 2023-12-21 20:47:38 -06:00
Brett Williams 685297e2f2 Use alphanumeric API tokens instead of ints 2023-12-21 20:46:55 -06:00
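One way to generate the alphanumeric API tokens this commit refers to, using the stdlib `secrets` module; the token length and charset are assumptions, not the project's actual values.

```python
# Cryptographically random alphanumeric tokens via the secrets module.
import secrets
import string

def generate_api_token(length=32):
    """Return a random token drawn from ASCII letters and digits."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

token = generate_api_token()
```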
brett d55f6a5187 Remove web components (#70)
* Remove old web code

* Add back missing gears file

* Client fetches thumbnails instead of being sent by server
2023-12-17 12:07:10 -06:00
Brett Williams 8863a38904 Add more docstrings 2023-12-16 22:23:02 -06:00
brett f663430984 Fix py2app (#69)
* Initial commit of py2app code

* Use environment variable RESOURCE_PATH when running as a bundle

* Move config files to system config location
2023-12-16 22:20:24 -06:00
brett 525fd99a58 Ffmpeg versioning issues (#68)
* FFMPEG version cleanup

* Make sure attempts don't go on forever

* Use latest version when version not defined. Add latest to UI
2023-11-22 08:47:47 -08:00
Brett Williams 4847338fc2 Fix FFMPEG version regex 2023-11-22 07:48:28 -08:00
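The kind of parsing these FFMPEG version fixes deal with: pull the version token out of `ffmpeg -version` banner output, falling back to `'error'` when it can't be determined (the convention #94 later standardizes). The sample banner and function name are illustrative.

```python
# Extract the version token from ffmpeg's banner line with a regex.
import re

def parse_ffmpeg_version(banner):
    """Return the version token, or 'error' if it cannot be determined."""
    match = re.search(r"ffmpeg version (\S+)", banner)
    return match.group(1) if match else "error"

banner = "ffmpeg version 6.1.1 Copyright (c) 2000-2023 the FFmpeg developers"
```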
brett c0d0ec64a8 Dynamic engine options in UI for blender / ffmpeg (#66)
* Make sure progress UI updates occur on main thread

* Cleanup unnecessary code in FFMPEG

* Cleanup extension matching

* Make sure supported_extensions is now called as a method everywhere

* Fix add_job crashing

* Update the renderer to reflect the current file type

* Sort engine versions from newest to oldest

* Consolidate Project Group and Server Group

* Split UI options into its own file for easier updating

* Add ffmpeg ui stem
2023-11-21 01:31:56 -08:00
brett 32afcf945d Use loopback address for local host (fixes issue with locked down networks) (#65) 2023-11-21 01:16:26 -08:00
brett e9f9521924 Report Engine Download Status in UI (#64)
* Report downloads in status bar

* Update engine_browser.py UI with any active downloads
2023-11-20 19:58:31 -08:00
82 changed files with 3677 additions and 1915 deletions
+38
@@ -0,0 +1,38 @@
name: Create Executables
on:
  workflow_dispatch:
  release:
    types: [created]
jobs:
  pyinstaller-build-windows:
    runs-on: windows-latest
    steps:
      - name: Create Executables (Windows)
        uses: sayyid5416/pyinstaller@v1
        with:
          python_ver: '3.11'
          spec: 'main.spec'
          requirements: 'requirements.txt'
          upload_exe_with_name: 'Zordon'
  pyinstaller-build-linux:
    runs-on: ubuntu-latest
    steps:
      - name: Create Executables (Linux)
        uses: sayyid5416/pyinstaller@v1
        with:
          python_ver: '3.11'
          spec: 'main.spec'
          requirements: 'requirements.txt'
          upload_exe_with_name: 'Zordon'
  pyinstaller-build-macos:
    runs-on: macos-latest
    steps:
      - name: Create Executables (macOS)
        uses: sayyid5416/pyinstaller@v1
        with:
          python_ver: '3.11'
          spec: 'main.spec'
          requirements: 'requirements.txt'
          upload_exe_with_name: 'Zordon'
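The three jobs above are identical except for the runner; a matrix strategy could express the same workflow more compactly. This is a sketch, not the repo's actual file:

```yaml
jobs:
  pyinstaller-build:
    strategy:
      matrix:
        os: [windows-latest, ubuntu-latest, macos-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - name: Create Executables (${{ matrix.os }})
        uses: sayyid5416/pyinstaller@v1
        with:
          python_ver: '3.11'
          spec: 'main.spec'
          requirements: 'requirements.txt'
          upload_exe_with_name: 'Zordon'
```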
-23
@@ -1,23 +0,0 @@
name: Pylint
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.8", "3.9", "3.10"]
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v3
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install pylint
      - name: Analysing the code with pylint
        run: |
          pylint $(git ls-files '*.py')
+3 -3
@@ -1,8 +1,8 @@
-/job_history.json
 *.icloud
 *.fcpxml
 /uploads
 *.pyc
-/server_state.json
-/.scheduler_prefs
 *.db
+/dist/
+/build/
+/.github/
+4
@@ -0,0 +1,4 @@
[MASTER]
max-line-length = 120
[MESSAGES CONTROL]
disable = missing-docstring, invalid-name, import-error, logging-fstring-interpolation
+21
@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2024 Brett Williams
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
+30 -16
@@ -1,23 +1,37 @@
-# 🎬 Zordon - Render Management Tools 🎬
-Welcome to Zordon! This is a hobby project written with fellow filmmakers in mind. It's a local network render farm manager, aiming to streamline and simplify the rendering process across multiple home computers.
-## 📦 Installation
-Make sure to install the necessary dependencies: `pip3 install -r requirements.txt`
-## 🚀 How to Use
-Zordon has two main files: `start_server.py` and `start_client.py`.
-- **start_server.py**: Run this on any computer you want to render jobs. It manages the incoming job queue and kicks off the appropriate render jobs when ready.
-- **start_client.py**: Run this to administer your render servers. It lets you manage and submit jobs.
-When the server is running, the job queue can be accessed via a web browser on the server's hostname (default port is 8080). You can also access it via the GUI client or a simple view-only dashboard.
-## 🎨 Supported Renderers
-Zordon currently supports the following renderers:
+# Zordon
+A tool designed for small render farms, such as those used in home studios or small businesses, to efficiently manage and run render jobs for Blender, FFMPEG, and other video renderers. It simplifies the process of distributing rendering tasks across multiple available machines, optimizing the rendering workflow for artists, animators, and video professionals.
+Notice: This should be considered a beta and is meant for casual / hobbyist use. Do not use in mission-critical environments!
+## Supported Renderers
+Zordon supports or plans to support the following renderers:
 - **Blender**
 - **FFMPEG**
+- **Adobe After Effects** ([coming soon](https://github.com/blw1138/Zordon/issues/84))
+- **Cinema 4D** ([planned](https://github.com/blw1138/Zordon/issues/105))
+- **Autodesk Maya** ([planned](https://github.com/blw1138/Zordon/issues/106))
+## System Requirements
+- Windows 10 or later
+- macOS Ventura (13.0) or later
+- Linux (Supported versions TBD)
+## Build using Pyinstaller
+Zordon is regularly tested with Python 3.11 and later. It's packaged and distributed with pyinstaller. It is supported on Windows, macOS and Linux.
+```
+git clone https://github.com/blw1138/Zordon.git
+pip3 install -r requirements.txt
+pip3 install pyinstaller
+pip3 install pyinstaller_versionfile
+pyinstaller main.spec
+```
+## License
+Zordon is licensed under the MIT License. See the [LICENSE](LICENSE.txt) file for more details.
+1 -1
@@ -3,7 +3,7 @@ update_engines_on_launch: true
 max_content_path: 100000000
 server_log_level: info
 log_buffer_length: 250
-subjob_connection_timeout: 120
+worker_process_timeout: 120
 flask_log_level: error
 flask_debug_enable: false
 queue_eval_seconds: 1
+1
@@ -1,4 +1,5 @@
 #!/usr/bin/env python3
 from src import init
 if __name__ == '__main__':
+119
@@ -0,0 +1,119 @@
# -*- mode: python ; coding: utf-8 -*-
from PyInstaller.utils.hooks import collect_all

# - get version from version file
import os
import sys
import platform

sys.path.insert(0, os.path.abspath('.'))
from version import APP_NAME, APP_VERSION, APP_AUTHOR

datas = [('resources', 'resources'), ('src/engines/blender/scripts/', 'src/engines/blender/scripts')]
binaries = []
hiddenimports = ['zeroconf']
tmp_ret = collect_all('zeroconf')
datas += tmp_ret[0]; binaries += tmp_ret[1]; hiddenimports += tmp_ret[2]

a = Analysis(
    ['main.py'],
    pathex=[],
    binaries=binaries,
    datas=datas,
    hiddenimports=hiddenimports,
    hookspath=[],
    hooksconfig={},
    runtime_hooks=[],
    excludes=[],
    noarchive=False,
    optimize=1,  # fyi: optim level 2 breaks on windows
)
pyz = PYZ(a.pure)

if platform.system() == 'Darwin':  # macOS
    exe = EXE(
        pyz,
        a.scripts,
        [],
        exclude_binaries=True,
        name=APP_NAME,
        debug=False,
        bootloader_ignore_signals=False,
        strip=True,
        upx=True,
        console=False,
        disable_windowed_traceback=False,
        argv_emulation=False,
        target_arch=None,
        codesign_identity=None,
        entitlements_file=None,
    )
    app = BUNDLE(
        exe,
        a.binaries,
        a.datas,
        strip=True,
        name=f'{APP_NAME}.app',
        icon=None,
        bundle_identifier=None,
        version=APP_VERSION
    )
elif platform.system() == 'Windows':
    import pyinstaller_versionfile
    import tempfile
    version_file_path = os.path.join(tempfile.gettempdir(), 'versionfile.txt')
    pyinstaller_versionfile.create_versionfile(
        output_file=version_file_path,
        version=APP_VERSION,
        company_name=APP_AUTHOR,
        file_description=APP_NAME,
        internal_name=APP_NAME,
        legal_copyright=f"© {APP_AUTHOR}",
        original_filename=f"{APP_NAME}.exe",
        product_name=APP_NAME
    )
    exe = EXE(
        pyz,
        a.scripts,
        a.binaries,
        a.datas,
        [],
        name=APP_NAME,
        debug=False,
        bootloader_ignore_signals=False,
        strip=True,
        upx=True,
        console=False,
        disable_windowed_traceback=False,
        argv_emulation=False,
        target_arch=None,
        codesign_identity=None,
        entitlements_file=None,
        version=version_file_path
    )
else:  # linux
    exe = EXE(
        pyz,
        a.scripts,
        a.binaries,
        a.datas,
        [],
        name=APP_NAME,
        debug=False,
        bootloader_ignore_signals=False,
        strip=True,
        upx=True,
        console=False,
        disable_windowed_traceback=False,
        argv_emulation=False,
        target_arch=None,
        codesign_identity=None,
        entitlements_file=None
    )
+38 -15
@@ -1,15 +1,38 @@
-requests==2.31.0
-psutil==5.9.6
-PyYAML==6.0.1
-Flask==3.0.0
-rich==13.6.0
-Werkzeug~=3.0.1
-json2html~=1.3.0
-SQLAlchemy~=2.0.15
-Pillow==10.1.0
-zeroconf==0.119.0
-Pypubsub~=4.0.3
-tqdm==4.66.1
-plyer==2.1.0
-PyQt6~=6.6.0
-PySide6~=6.6.0
+PyQt6>=6.6.1
+psutil>=5.9.8
+requests>=2.31.0
+Pillow>=10.2.0
+PyYAML>=6.0.1
+flask>=3.0.2
+tqdm>=4.66.2
+werkzeug>=3.0.1
+Pypubsub>=4.0.3
+zeroconf>=0.131.0
+SQLAlchemy>=2.0.25
+plyer>=2.1.0
+pytz>=2023.3.post1
+future>=0.18.3
+rich>=13.7.0
+pytest>=8.0.0
+numpy>=1.26.3
+setuptools>=69.0.3
+pandas>=2.2.0
+matplotlib>=3.8.2
+MarkupSafe>=2.1.4
+dmglib>=0.9.5; sys_platform == 'darwin'
+python-dateutil>=2.8.2
+certifi>=2023.11.17
+shiboken6>=6.6.1
+Pygments>=2.17.2
+cycler>=0.12.1
+contourpy>=1.2.0
+packaging>=23.2
+fonttools>=4.47.2
+Jinja2>=3.1.3
+pyparsing>=3.1.1
+kiwisolver>=1.4.5
+attrs>=23.2.0
+lxml>=5.1.0
+click>=8.1.7
+requests_toolbelt>=1.0.0
+pyinstaller_versionfile>=2.1.1

26 binary image files changed (Before/After sizes identical)
+2 -2
@@ -1,5 +1,5 @@
 #!/usr/bin/env python3
-from src.api.api_server import start_server
+from init import run
 if __name__ == '__main__':
-    start_server()
+    run(server_only=True)
+34 -63
@@ -10,14 +10,28 @@ import requests
 from tqdm import tqdm
 from werkzeug.utils import secure_filename
-from src.distributed_job_manager import DistributedJobManager
-from src.engines.engine_manager import EngineManager
-from src.render_queue import RenderQueue
 logger = logging.getLogger()
 def handle_uploaded_project_files(request, jobs_list, upload_directory):
+    """
+    Handles the uploaded project files.
+    This method takes a request with a file, a list of jobs, and an upload directory. It checks if the file was uploaded
+    directly, if it needs to be downloaded from a URL, or if it's already present on the local file system. It then
+    moves the file to the appropriate directory and returns the local path to the file and its name.
+    Args:
+        request (Request): The request object containing the file.
+        jobs_list (list): A list of jobs. The first job in the list is used to get the file's URL and local path.
+        upload_directory (str): The directory where the file should be uploaded.
+    Raises:
+        ValueError: If no valid project paths are found.
+    Returns:
+        tuple: A tuple containing the local path to the loaded project file and its name.
+    """
     # Initialize default values
     loaded_project_local_path = None
@@ -35,12 +49,11 @@ def handle_uploaded_project_files(request, jobs_list, upload_directory):
         raise ValueError(f"Error downloading file from URL: {project_url}")
     elif local_path and os.path.exists(local_path):
         referred_name = os.path.basename(local_path)
     else:
         raise ValueError("Cannot find any valid project paths")
     # Prepare the local filepath
-    cleaned_path_name = os.path.splitext(referred_name)[0].replace(' ', '_')
+    cleaned_path_name = jobs_list[0].get('name', os.path.splitext(referred_name)[0]).replace(' ', '-')
     job_dir = os.path.join(upload_directory, '-'.join(
         [datetime.now().strftime("%Y.%m.%d_%H.%M.%S"), renderer, cleaned_path_name]))
     os.makedirs(job_dir, exist_ok=True)
@@ -68,7 +81,6 @@ def download_project_from_url(project_url):
     # This nested function is to handle downloading from a URL
     logger.info(f"Downloading project from url: {project_url}")
     referred_name = os.path.basename(project_url)
-    downloaded_file_url = None
     try:
         response = requests.get(project_url, stream=True)
@@ -95,7 +107,21 @@ def download_project_from_url(project_url):
 def process_zipped_project(zip_path):
-    # Given a zip path, extract its content, and return the main project file path
+    """
+    Processes a zipped project.
+    This method takes a path to a zip file, extracts its contents, and returns the path to the extracted project file.
+    If the zip file contains more than one project file or none, an error is raised.
+    Args:
+        zip_path (str): The path to the zip file.
+    Raises:
+        ValueError: If there's more than 1 project file or none in the zip file.
+    Returns:
+        str: The path to the main project file.
+    """
     work_path = os.path.dirname(zip_path)
     try:
@@ -122,58 +148,3 @@ def process_zipped_project(zip_path):
         logger.error(f"Error processing zip file: {e}")
         raise ValueError(f"Error processing zip file: {e}")
     return extracted_project_path
-def create_render_jobs(jobs_list, loaded_project_local_path, job_dir):
-    results = []
-    for job_data in jobs_list:
-        try:
-            # get new output path in output_dir
-            output_path = job_data.get('output_path')
-            if not output_path:
-                loaded_project_filename = os.path.basename(loaded_project_local_path)
-                output_filename = os.path.splitext(loaded_project_filename)[0]
-            else:
-                output_filename = os.path.basename(output_path)
-            # Prepare output path
-            output_dir = os.path.join(os.path.dirname(os.path.dirname(loaded_project_local_path)), 'output')
-            output_path = os.path.join(output_dir, output_filename)
-            os.makedirs(output_dir, exist_ok=True)
-            logger.debug(f"New job output path: {output_path}")
-            # create & configure jobs
-            worker = EngineManager.create_worker(renderer=job_data['renderer'],
-                                                 input_path=loaded_project_local_path,
-                                                 output_path=output_path,
-                                                 engine_version=job_data.get('engine_version'),
-                                                 args=job_data.get('args', {}))
-            worker.status = job_data.get("initial_status", worker.status)
-            worker.parent = job_data.get("parent", worker.parent)
-            worker.name = job_data.get("name", worker.name)
-            worker.priority = int(job_data.get('priority', worker.priority))
-            worker.start_frame = int(job_data.get("start_frame", worker.start_frame))
-            worker.end_frame = int(job_data.get("end_frame", worker.end_frame))
-            # determine if we can / should split the job
-            if job_data.get("enable_split_jobs", False) and (worker.total_frames > 1) and not worker.parent:
-                DistributedJobManager.split_into_subjobs(worker, job_data, loaded_project_local_path)
-            else:
-                logger.debug("Not splitting into subjobs")
-            RenderQueue.add_to_render_queue(worker, force_start=job_data.get('force_start', False))
-            if not worker.parent:
-                from src.api.api_server import make_job_ready
-                make_job_ready(worker.id)
-            results.append(worker.json())
-        except FileNotFoundError as e:
-            err_msg = f"Cannot create job: {e}"
-            logger.error(err_msg)
-            results.append({'error': err_msg})
-        except Exception as e:
-            err_msg = f"Exception creating render job: {e}"
-            logger.exception(err_msg)
-            results.append({'error': err_msg})
-    return results
+284 -299
@@ -1,158 +1,96 @@
 #!/usr/bin/env python3
+import concurrent.futures
 import json
 import logging
-import multiprocessing
 import os
 import pathlib
 import shutil
 import socket
 import ssl
 import tempfile
-import threading
 import time
 from datetime import datetime
-from zipfile import ZipFile
-import json2html
 import psutil
 import yaml
-from flask import Flask, request, render_template, send_file, after_this_request, Response, redirect, url_for, abort
+from flask import Flask, request, send_file, after_this_request, Response, redirect, url_for
+from sqlalchemy.orm.exc import DetachedInstanceError
-from src.api.add_job_helpers import handle_uploaded_project_files, process_zipped_project, create_render_jobs
+from src.api.add_job_helpers import handle_uploaded_project_files, process_zipped_project
-from src.api.serverproxy_manager import ServerProxyManager
+from src.api.preview_manager import PreviewManager
 from src.distributed_job_manager import DistributedJobManager
-from src.engines.core.base_worker import string_to_status, RenderStatus
 from src.engines.engine_manager import EngineManager
 from src.render_queue import RenderQueue, JobNotFoundError
 from src.utilities.config import Config
 from src.utilities.misc_helper import system_safe_path, current_system_os, current_system_cpu, \
-    current_system_os_version, config_dir
+    current_system_os_version, num_to_alphanumeric
-from src.utilities.server_helper import generate_thumbnail_for_job
+from src.utilities.status_utils import string_to_status
-from src.utilities.zeroconf_server import ZeroconfServer

 logger = logging.getLogger()
-server = Flask(__name__, template_folder='web/templates', static_folder='web/static')
+server = Flask(__name__)
 ssl._create_default_https_context = ssl._create_unverified_context  # disable SSL for downloads
-categories = [RenderStatus.RUNNING, RenderStatus.ERROR, RenderStatus.NOT_STARTED, RenderStatus.SCHEDULED,
-              RenderStatus.COMPLETED, RenderStatus.CANCELLED]
+
+
+def start_server(hostname=None):
+    # get hostname
+    if not hostname:
+        local_hostname = socket.gethostname()
+        hostname = local_hostname + (".local" if not local_hostname.endswith(".local") else "")
+    # load flask settings
+    server.config['HOSTNAME'] = hostname
+    server.config['PORT'] = int(Config.port_number)
+    server.config['UPLOAD_FOLDER'] = system_safe_path(os.path.expanduser(Config.upload_folder))
+    server.config['MAX_CONTENT_PATH'] = Config.max_content_path
+    server.config['enable_split_jobs'] = Config.enable_split_jobs
+    # disable most Flask logging
+    flask_log = logging.getLogger('werkzeug')
+    flask_log.setLevel(Config.flask_log_level.upper())
+    logger.debug('Starting API server')
+    try:
+        server.run(host=hostname, port=server.config['PORT'], debug=Config.flask_debug_enable, use_reloader=False,
+                   threaded=True)
+    finally:
+        logger.debug('Stopping API server')


-def sorted_jobs(all_jobs, sort_by_date=True):
-    if not sort_by_date:
-        sorted_job_list = []
-        if all_jobs:
-            for status_category in categories:
-                found_jobs = [x for x in all_jobs if x.status == status_category.value]
-                if found_jobs:
-                    sorted_found_jobs = sorted(found_jobs, key=lambda d: d.date_created, reverse=True)
-                    sorted_job_list.extend(sorted_found_jobs)
-    else:
-        sorted_job_list = sorted(all_jobs, key=lambda d: d.date_created, reverse=True)
-    return sorted_job_list
-
-
-@server.route('/')
-@server.route('/index')
-def index():
-    with open(system_safe_path(os.path.join(config_dir(), 'presets.yaml'))) as f:
-        render_presets = yaml.load(f, Loader=yaml.FullLoader)
-    return render_template('index.html', all_jobs=sorted_jobs(RenderQueue.all_jobs()),
-                           hostname=server.config['HOSTNAME'], renderer_info=renderer_info(),
-                           render_clients=[server.config['HOSTNAME']], preset_list=render_presets)
+# --------------------------------------------
+# Get All Jobs
+# --------------------------------------------


 @server.get('/api/jobs')
 def jobs_json():
-    try:
-        hash_token = request.args.get('token', None)
-        all_jobs = [x.json() for x in RenderQueue.all_jobs()]
-        job_cache_token = str(json.dumps(all_jobs).__hash__())
-        if hash_token and hash_token == job_cache_token:
-            return [], 204  # no need to update
-        else:
-            return {'jobs': all_jobs, 'token': job_cache_token}
-    except Exception as e:
-        logger.exception(f"Exception fetching all_jobs_cached: {e}")
-        return [], 500
+    """Retrieves all jobs from the render queue in JSON format.
+
+    This endpoint fetches all jobs currently in the render queue, converts them to JSON format,
+    and returns them along with a cache token that represents the current state of the job list.
+
+    Returns:
+        dict: A dictionary containing:
+            - 'jobs' (list[dict]): A list of job dictionaries, each representing a job in the queue.
+            - 'token' (str): A cache token generated from the hash of the job list.
+    """
+    all_jobs = [x.json() for x in RenderQueue.all_jobs()]
+    job_cache_int = int(json.dumps(all_jobs).__hash__())
+    job_cache_token = num_to_alphanumeric(job_cache_int)
+    return {'jobs': all_jobs, 'token': job_cache_token}
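The new handler swaps the raw `str(hash(...))` token for an alphanumeric one built by the repo's `num_to_alphanumeric` helper. The helper's actual implementation isn't shown in this diff; a plausible stand-in (an assumption, not the project's code) is a base-36 encoding of the absolute hash value:

```python
# Hypothetical sketch of num_to_alphanumeric: base-36 encode the job-list hash
# so clients get a compact, URL-safe cache token. The real helper may differ.
import json

ALPHABET = '0123456789abcdefghijklmnopqrstuvwxyz'


def num_to_alphanumeric(num):
    """Encode an integer as a compact base-36 string."""
    num = abs(num)
    if num == 0:
        return ALPHABET[0]
    digits = []
    while num:
        num, rem = divmod(num, 36)
        digits.append(ALPHABET[rem])
    return ''.join(reversed(digits))


def cache_token(jobs):
    """Token changes whenever the serialized job list changes."""
    return num_to_alphanumeric(hash(json.dumps(jobs)))
```

The token only needs to be stable within one server process (it detects "did the list change since your last poll"), which is why hashing the JSON string is enough here.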
-@server.route('/ui/job/<job_id>/full_details')
-def job_detail(job_id):
-    found_job = RenderQueue.job_with_id(job_id)
-    table_html = json2html.json2html.convert(json=found_job.json(),
-                                             table_attributes='class="table is-narrow is-striped is-fullwidth"')
-    media_url = None
-    if found_job.file_list() and found_job.status == RenderStatus.COMPLETED:
-        media_basename = os.path.basename(found_job.file_list()[0])
-        media_url = f"/api/job/{job_id}/file/{media_basename}"
-    return render_template('details.html', detail_table=table_html, media_url=media_url,
-                           hostname=server.config['HOSTNAME'], job_status=found_job.status.value.title(),
-                           job=found_job, renderer_info=renderer_info())
+@server.get('/api/jobs_long_poll')
+def long_polling_jobs():
+    hash_token = request.args.get('token', None)
+    start_time = time.time()
+    while True:
+        all_jobs = jobs_json()
+        if all_jobs['token'] != hash_token:
+            return all_jobs
+        # Break after 30 seconds to avoid gateway timeout
+        if time.time() - start_time > 30:
+            return {}, 204
+        time.sleep(1)


-@server.route('/api/job/<job_id>/thumbnail')
-def job_thumbnail(job_id):
-    big_thumb = request.args.get('size', False) == "big"
-    video_ok = request.args.get('video_ok', False)
-    found_job = RenderQueue.job_with_id(job_id, none_ok=True)
-    if found_job:
-        os.makedirs(server.config['THUMBS_FOLDER'], exist_ok=True)
-        thumb_video_path = os.path.join(server.config['THUMBS_FOLDER'], found_job.id + '.mp4')
-        thumb_image_path = os.path.join(server.config['THUMBS_FOLDER'], found_job.id + '.jpg')
-        big_video_path = os.path.join(server.config['THUMBS_FOLDER'], found_job.id + '_big.mp4')
-        big_image_path = os.path.join(server.config['THUMBS_FOLDER'], found_job.id + '_big.jpg')
-        # generate regular thumb if it doesn't exist
-        if not os.path.exists(thumb_video_path) and not os.path.exists(thumb_video_path + '_IN-PROGRESS') and \
-                found_job.status not in [RenderStatus.CANCELLED, RenderStatus.ERROR]:
-            generate_thumbnail_for_job(found_job, thumb_video_path, thumb_image_path, max_width=240)
-        # generate big thumb if it doesn't exist
-        if not os.path.exists(big_video_path) and not os.path.exists(big_image_path + '_IN-PROGRESS') and \
-                found_job.status not in [RenderStatus.CANCELLED, RenderStatus.ERROR]:
-            generate_thumbnail_for_job(found_job, big_video_path, big_image_path, max_width=800)
-        # generated videos
-        if video_ok:
-            if big_thumb and os.path.exists(big_video_path) and not os.path.exists(
-                    big_video_path + '_IN-PROGRESS'):
-                return send_file(big_video_path, mimetype="video/mp4")
-            elif os.path.exists(thumb_video_path) and not os.path.exists(thumb_video_path + '_IN-PROGRESS'):
-                return send_file(thumb_video_path, mimetype="video/mp4")
-        # Generated thumbs
-        if big_thumb and os.path.exists(big_image_path):
-            return send_file(big_image_path, mimetype='image/jpeg')
-        elif os.path.exists(thumb_image_path):
-            return send_file(thumb_image_path, mimetype='image/jpeg')
-        # Misc status icons
-        if found_job.status == RenderStatus.RUNNING:
-            return send_file('../web/static/images/gears.png', mimetype="image/png")
-        elif found_job.status == RenderStatus.CANCELLED:
-            return send_file('../web/static/images/cancelled.png', mimetype="image/png")
-        elif found_job.status == RenderStatus.SCHEDULED:
-            return send_file('../web/static/images/scheduled.png', mimetype="image/png")
-        elif found_job.status == RenderStatus.NOT_STARTED:
-            return send_file('../web/static/images/not_started.png', mimetype="image/png")
-    # errors
-    return send_file('../web/static/images/error.png', mimetype="image/png")
-
-
-# Get job file routing
-@server.route('/api/job/<job_id>/file/<filename>', methods=['GET'])
-def get_job_file(job_id, filename):
-    found_job = RenderQueue.job_with_id(job_id)
-    try:
-        for full_path in found_job.file_list():
-            if filename in full_path:
-                return send_file(path_or_file=full_path)
-    except FileNotFoundError:
-        abort(404)
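The new `/api/jobs_long_poll` endpoint holds the request open until the job-list token changes and returns 204 after ~30 seconds so the client simply reconnects. A client-side loop against it might look like this sketch, where the `fetch` callable stands in for an HTTP GET to `/api/jobs_long_poll?token=...` (the transport is deliberately abstracted so the loop itself is testable):

```python
# Hedged sketch of a long-polling client. `fetch(token)` is assumed to return
# (body, status) for GET /api/jobs_long_poll?token=<token>; a 204 means the
# server timed out with no change, so we just poll again with the same token.
def poll_jobs(fetch, on_update, max_rounds=10):
    """Repeatedly long-poll; invoke on_update whenever the token changes."""
    token = None
    for _ in range(max_rounds):
        body, status = fetch(token)
        if status == 204:          # no change before the server's 30s cutoff
            continue
        token = body['token']      # remember the new state token
        on_update(body['jobs'])
    return token
```

In a real client, `fetch` would be an HTTP call with a read timeout comfortably above the server's 30-second cutoff.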
 @server.get('/api/jobs/<status_val>')
@@ -165,29 +103,33 @@ def filtered_jobs_json(status_val):
 return f'Cannot find jobs with status {status_val}', 400


-@server.post('/api/job/<job_id>/notify_parent_of_status_change')
-def subjob_status_change(job_id):
-    try:
-        subjob_details = request.json
-        logger.info(f"Subjob to job id: {job_id} is now {subjob_details['status']}")
-        DistributedJobManager.handle_subjob_status_change(RenderQueue.job_with_id(job_id), subjob_data=subjob_details)
-        return Response(status=200)
-    except JobNotFoundError:
-        return "Job not found", 404
-
-
-@server.errorhandler(JobNotFoundError)
-def handle_job_not_found(job_error):
-    return f'Cannot find job with ID {job_error.job_id}', 400
+# --------------------------------------------
+# Job Details / File Handling
+# --------------------------------------------


 @server.get('/api/job/<job_id>')
-def get_job_status(job_id):
+def get_job_details(job_id):
+    """Retrieves the details of a requested job in JSON format
+
+    Args:
+        job_id (str): The ID of the render job.
+
+    Returns:
+        dict: A JSON representation of the job's details.
+    """
     return RenderQueue.job_with_id(job_id).json()


 @server.get('/api/job/<job_id>/logs')
 def get_job_logs(job_id):
+    """Retrieves the log file for a specific render job.
+
+    Args:
+        job_id (str): The ID of the render job.
+
+    Returns:
+        Response: The log file's content as plain text, or an empty response if the log file is not found.
+    """
     found_job = RenderQueue.job_with_id(job_id)
     log_path = system_safe_path(found_job.log_path())
     log_data = None
@@ -199,40 +141,41 @@ def get_job_logs(job_id):
 @server.get('/api/job/<job_id>/file_list')
 def get_file_list(job_id):
-    return RenderQueue.job_with_id(job_id).file_list()
+    return [os.path.basename(x) for x in RenderQueue.job_with_id(job_id).file_list()]


-@server.get('/api/job/<job_id>/make_ready')
-def make_job_ready(job_id):
-    try:
-        found_job = RenderQueue.job_with_id(job_id)
-        if found_job.status in [RenderStatus.CONFIGURING, RenderStatus.NOT_STARTED]:
-            if found_job.children:
-                for child_key in found_job.children.keys():
-                    child_id = child_key.split('@')[0]
-                    hostname = child_key.split('@')[-1]
-                    ServerProxyManager.get_proxy_for_hostname(hostname).request_data(f'job/{child_id}/make_ready')
-            found_job.status = RenderStatus.NOT_STARTED
-            RenderQueue.save_state()
-            return found_job.json(), 200
-    except Exception as e:
-        return "Error making job ready: {e}", 500
-    return "Not valid command", 405
+@server.route('/api/job/<job_id>/download')
+def download_requested_file(job_id):
+    requested_filename = request.args.get('filename')
+    if not requested_filename:
+        return 'Filename required', 400
+    found_job = RenderQueue.job_with_id(job_id)
+    for job_filename in found_job.file_list():
+        if os.path.basename(job_filename).lower() == requested_filename.lower():
+            return send_file(job_filename, as_attachment=True)
+
+    return f"File '{requested_filename}' not found", 404


 @server.route('/api/job/<job_id>/download_all')
-def download_all(job_id):
+def download_all_files(job_id):
     zip_filename = None

     @after_this_request
     def clear_zip(response):
         if zip_filename and os.path.exists(zip_filename):
-            os.remove(zip_filename)
+            try:
+                os.remove(zip_filename)
+            except Exception as e:
+                logger.warning(f"Error removing zip file '{zip_filename}': {e}")
         return response

     found_job = RenderQueue.job_with_id(job_id)
     output_dir = os.path.dirname(found_job.output_path)
     if os.path.exists(output_dir):
+        from zipfile import ZipFile
         zip_filename = system_safe_path(os.path.join(tempfile.gettempdir(),
                                                      pathlib.Path(found_job.input_path).stem + '.zip'))
         with ZipFile(zip_filename, 'w') as zipObj:
@@ -244,12 +187,16 @@ def download_all(job_id):
 return f'Cannot find project files for job {job_id}', 500


+# --------------------------------------------
+# System Environment / Status
+# --------------------------------------------


 @server.get('/api/presets')
 def presets():
     presets_path = system_safe_path('config/presets.yaml')
     with open(presets_path) as f:
-        presets = yaml.load(f, Loader=yaml.FullLoader)
-    return presets
+        loaded_presets = yaml.load(f, Loader=yaml.FullLoader)
+    return loaded_presets
@@ -275,13 +222,28 @@ def snapshot():
 return server_data


-@server.get('/api/_detected_clients')
-def detected_clients():
-    # todo: dev/debug only. Should not ship this - probably.
-    return ZeroconfServer.found_hostnames()
+@server.route('/api/status')
+def status():
+    return {"timestamp": datetime.now().isoformat(),
+            "system_os": current_system_os(),
+            "system_os_version": current_system_os_version(),
+            "system_cpu": current_system_cpu(),
+            "cpu_percent": psutil.cpu_percent(percpu=False),
+            "cpu_percent_per_cpu": psutil.cpu_percent(percpu=True),
+            "cpu_count": psutil.cpu_count(logical=False),
+            "memory_total": psutil.virtual_memory().total,
+            "memory_available": psutil.virtual_memory().available,
+            "memory_percent": psutil.virtual_memory().percent,
+            "job_counts": RenderQueue.job_counts(),
+            "hostname": server.config['HOSTNAME'],
+            "port": server.config['PORT']
+            }
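A coordinator distributing work across several render nodes could rank hosts using the load fields in this `/api/status` payload. A minimal sketch (the field names come from the handler above; the least-busy-first heuristic itself is an assumption, not something the project implements here):

```python
# Illustrative ranking of /api/status payloads: prefer hosts with the lowest
# CPU load, breaking ties by the fraction of memory already in use.
def rank_hosts(statuses):
    """Return status dicts ordered least-busy first."""
    return sorted(statuses,
                  key=lambda s: (s['cpu_percent'], s['memory_percent']))
```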
-# New version
+# --------------------------------------------
+# Job Lifecycle (Create, Cancel, Delete)
+# --------------------------------------------


 @server.post('/api/add_job')
 def add_job_handler():
     # Process request data
@@ -291,18 +253,7 @@ def add_job_handler():
 elif request.form.get('json', None):
     jobs_list = json.loads(request.form['json'])
 else:
-    # Cleanup flat form data into nested structure
-    form_dict = {k: v for k, v in dict(request.form).items() if v}
-    args = {}
-    arg_keys = [k for k in form_dict.keys() if '-arg_' in k]
-    for server_hostname in arg_keys:
-        if form_dict['renderer'] in server_hostname or 'AnyRenderer' in server_hostname:
-            cleaned_key = server_hostname.split('-arg_')[-1]
-            args[cleaned_key] = form_dict[server_hostname]
-        form_dict.pop(server_hostname)
-    args['raw'] = form_dict.get('raw_args', None)
-    form_dict['args'] = args
-    jobs_list = [form_dict]
+    return "Invalid data", 400
 except Exception as e:
     err_msg = f"Error processing job data: {e}"
     logger.error(err_msg)
@@ -314,16 +265,13 @@ def add_job_handler():
 if loaded_project_local_path.lower().endswith('.zip'):
     loaded_project_local_path = process_zipped_project(loaded_project_local_path)

-    results = create_render_jobs(jobs_list, loaded_project_local_path, referred_name)
-    for response in results:
-        if response.get('error', None):
-            return results, 400
-    if request.args.get('redirect', False):
-        return redirect(url_for('index'))
-    else:
-        return results, 200
+    results = []
+    for new_job_data in jobs_list:
+        new_job = DistributedJobManager.create_render_job(new_job_data, loaded_project_local_path)
+        results.append(new_job.json())
+    return results, 200
 except Exception as e:
-    logger.exception(f"Unknown error adding job: {e}")
+    logger.exception(f"Error adding job: {e}")
     return 'unknown error', 500
@@ -349,87 +297,94 @@ def delete_job(job_id):
 # Check if we can remove the 'output' directory
 found_job = RenderQueue.job_with_id(job_id)
 project_dir = os.path.dirname(os.path.dirname(found_job.input_path))
 output_dir = os.path.dirname(found_job.output_path)
+found_job.stop()
+try:
+    PreviewManager.delete_previews_for_job(found_job)
+except Exception as e:
+    logger.error(f"Error deleting previews for {found_job}: {e}")
+# finally delete the job
+RenderQueue.delete_job(found_job)
+# delete the output_dir
 if server.config['UPLOAD_FOLDER'] in output_dir and os.path.exists(output_dir):
     shutil.rmtree(output_dir)
-# Remove any thumbnails
-for filename in os.listdir(server.config['THUMBS_FOLDER']):
-    if job_id in filename:
-        os.remove(os.path.join(server.config['THUMBS_FOLDER'], filename))
-thumb_path = os.path.join(server.config['THUMBS_FOLDER'], found_job.id + '.mp4')
-if os.path.exists(thumb_path):
-    os.remove(thumb_path)
-# See if we own the project_dir (i.e. was it uploaded)
+# See if we own the project_dir (i.e. was it uploaded) - if so delete the directory
+try:
     if server.config['UPLOAD_FOLDER'] in project_dir and os.path.exists(project_dir):
         # check to see if any other projects are sharing the same project file
         project_dir_files = [f for f in os.listdir(project_dir) if not f.startswith('.')]
         if len(project_dir_files) == 0 or (len(project_dir_files) == 1 and 'source' in project_dir_files[0]):
             logger.info(f"Removing project directory: {project_dir}")
             shutil.rmtree(project_dir)
+except Exception as e:
+    logger.error(f"Error removing project files: {e}")
-RenderQueue.delete_job(found_job)
-if request.args.get('redirect', False):
-    return redirect(url_for('index'))
-else:
-    return "Job deleted", 200
+return "Job deleted", 200
 except Exception as e:
     logger.error(f"Error deleting job: {e}")
     return f"Error deleting job: {e}", 500


-@server.get('/api/clear_history')
-def clear_history():
-    RenderQueue.clear_history()
-    return 'success'
-
-
-@server.route('/api/status')
-def status():
-    renderer_data = {}
-    for render_class in EngineManager.supported_engines():
-        if EngineManager.all_versions_for_engine(render_class.name):  # only return renderers installed on host
-            renderer_data[render_class.engine.name()] = \
-                {'versions': EngineManager.all_versions_for_engine(render_class.engine.name()),
-                 'is_available': RenderQueue.is_available_for_job(render_class.engine.name())
-                 }
-    # Get system info
-    return {"timestamp": datetime.now().isoformat(),
-            "system_os": current_system_os(),
-            "system_os_version": current_system_os_version(),
-            "system_cpu": current_system_cpu(),
-            "cpu_percent": psutil.cpu_percent(percpu=False),
-            "cpu_percent_per_cpu": psutil.cpu_percent(percpu=True),
-            "cpu_count": psutil.cpu_count(logical=False),
-            "memory_total": psutil.virtual_memory().total,
-            "memory_available": psutil.virtual_memory().available,
-            "memory_percent": psutil.virtual_memory().percent,
-            "job_counts": RenderQueue.job_counts(),
-            "renderers": renderer_data,
-            "hostname": server.config['HOSTNAME'],
-            "port": server.config['PORT']
-            }
+# --------------------------------------------
+# Engine Info and Management:
+# --------------------------------------------


 @server.get('/api/renderer_info')
 def renderer_info():
-    renderer_data = {}
-    for engine in EngineManager.supported_engines():
-        # Get all installed versions of engine
-        installed_versions = EngineManager.all_versions_for_engine(engine.name())
-        if installed_versions:
-            # fixme: using system versions only because downloaded versions may have permissions issues
-            system_installed_versions = [x for x in installed_versions if x['type'] == 'system']
-            install_path = system_installed_versions[0]['path'] if system_installed_versions else installed_versions[0]['path']
-            renderer_data[engine.name()] = {'is_available': RenderQueue.is_available_for_job(engine.name()),
-                                            'versions': installed_versions,
-                                            'supported_extensions': engine.supported_extensions(),
-                                            'supported_export_formats': engine(install_path).get_output_formats()}
+    response_type = request.args.get('response_type', 'standard')
+    if response_type not in ['full', 'standard']:
+        raise ValueError(f"Invalid response_type: {response_type}")
+
+    def process_engine(engine):
+        try:
+            # Get all installed versions of the engine
+            installed_versions = EngineManager.all_versions_for_engine(engine.name())
+            if not installed_versions:
+                return None
+            system_installed_versions = [v for v in installed_versions if v['type'] == 'system']
+            install_path = system_installed_versions[0]['path'] if system_installed_versions else installed_versions[0]['path']
+            en = engine(install_path)
+            engine_name = en.name()
+            result = {
+                engine_name: {
+                    'is_available': RenderQueue.is_available_for_job(engine_name),
+                    'versions': installed_versions
+                }
+            }
+            if response_type == 'full':
+                with concurrent.futures.ThreadPoolExecutor() as executor:
+                    future_results = {
+                        'supported_extensions': executor.submit(en.supported_extensions),
+                        'supported_export_formats': executor.submit(en.get_output_formats),
+                        'system_info': executor.submit(en.system_info)
+                    }
+                    for key, future in future_results.items():
+                        result[engine_name][key] = future.result()
+            return result
+        except Exception as e:
+            logger.error(f'Error fetching details for {engine.name()} renderer: {e}')
+            raise e
+
+    renderer_data = {}
+    with concurrent.futures.ThreadPoolExecutor() as executor:
+        futures = {executor.submit(process_engine, engine): engine.name() for engine in EngineManager.supported_engines()}
+        for future in concurrent.futures.as_completed(futures):
+            result = future.result()
+            if result:
+                renderer_data.update(result)
     return renderer_data
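The rewritten `renderer_info` fans out one worker per engine with a `ThreadPoolExecutor` and merges the per-engine dicts, skipping engines that return `None` because nothing is installed. The pattern in isolation, with dummy probe callables standing in for the `EngineManager` calls:

```python
# Sketch of the fan-out-and-merge pattern: run independent probes concurrently,
# collect results as they complete, and fold non-None dicts into one mapping.
import concurrent.futures


def gather(probes):
    """Run probe callables concurrently; merge their non-None dict results."""
    merged = {}
    with concurrent.futures.ThreadPoolExecutor() as executor:
        futures = [executor.submit(p) for p in probes]
        for future in concurrent.futures.as_completed(futures):
            result = future.result()  # re-raises if the probe failed
            if result:
                merged.update(result)
    return merged
```

Because each probe owns a distinct top-level key, the merge order imposed by `as_completed` does not affect the final dictionary.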
@@ -499,65 +454,95 @@ def get_renderer_help(renderer):
 return f"Cannot find renderer '{renderer}'", 400


-@server.route('/upload')
-def upload_file_page():
-    return render_template('upload.html', supported_renderers=EngineManager.supported_engines())
+# --------------------------------------------
+# Miscellaneous:
+# --------------------------------------------


+@server.post('/api/job/<job_id>/send_subjob_update_notification')
+def subjob_update_notification(job_id):
+    subjob_details = request.json
+    DistributedJobManager.handle_subjob_update_notification(RenderQueue.job_with_id(job_id), subjob_data=subjob_details)
+    return Response(status=200)


-def start_server():
-    def eval_loop(delay_sec=1):
-        while True:
-            RenderQueue.evaluate_queue()
-            time.sleep(delay_sec)
-
-    # get hostname
-    local_hostname = socket.gethostname()
-    local_hostname = local_hostname + (".local" if not local_hostname.endswith(".local") else "")
-    # load flask settings
-    server.config['HOSTNAME'] = local_hostname
-    server.config['PORT'] = int(Config.port_number)
-    server.config['UPLOAD_FOLDER'] = system_safe_path(os.path.expanduser(Config.upload_folder))
-    server.config['THUMBS_FOLDER'] = system_safe_path(os.path.join(os.path.expanduser(Config.upload_folder), 'thumbs'))
-    server.config['MAX_CONTENT_PATH'] = Config.max_content_path
-    server.config['enable_split_jobs'] = Config.enable_split_jobs
-    # Setup directory for saving engines to
-    EngineManager.engines_path = system_safe_path(os.path.join(os.path.join(os.path.expanduser(Config.upload_folder),
-                                                                            'engines')))
-    os.makedirs(EngineManager.engines_path, exist_ok=True)
-    # Debug info
-    logger.debug(f"Upload directory: {server.config['UPLOAD_FOLDER']}")
-    logger.debug(f"Thumbs directory: {server.config['THUMBS_FOLDER']}")
-    logger.debug(f"Engines directory: {EngineManager.engines_path}")
-    # disable most Flask logging
-    flask_log = logging.getLogger('werkzeug')
-    flask_log.setLevel(Config.flask_log_level.upper())
-    # check for updates for render engines if config'd or on first launch
-    if Config.update_engines_on_launch or not EngineManager.all_engines():
-        EngineManager.update_all_engines()
-    # Set up the RenderQueue object
-    RenderQueue.load_state(database_directory=server.config['UPLOAD_FOLDER'])
-    ServerProxyManager.subscribe_to_listener()
-    DistributedJobManager.subscribe_to_listener()
-    thread = threading.Thread(target=eval_loop, kwargs={'delay_sec': Config.queue_eval_seconds}, daemon=True)
-    thread.start()
-    logger.info(f"Starting Zordon Render Server - Hostname: '{server.config['HOSTNAME']}:'")
-    ZeroconfServer.configure("_zordon._tcp.local.", server.config['HOSTNAME'], server.config['PORT'])
-    ZeroconfServer.properties = {'system_cpu': current_system_cpu(), 'system_cpu_cores': multiprocessing.cpu_count(),
-                                 'system_os': current_system_os(),
-                                 'system_os_version': current_system_os_version()}
-    ZeroconfServer.start()
-    try:
-        server.run(host='0.0.0.0', port=server.config['PORT'], debug=Config.flask_debug_enable,
-                   use_reloader=False, threaded=True)
-    finally:
-        RenderQueue.save_state()
-        ZeroconfServer.stop()
+@server.route('/api/job/<job_id>/thumbnail')
+def job_thumbnail(job_id):
+    try:
+        big_thumb = request.args.get('size', False) == "big"
+        video_ok = request.args.get('video_ok', False)
+        found_job = RenderQueue.job_with_id(job_id, none_ok=False)
+
+        # trigger a thumbnail update - just in case
+        PreviewManager.update_previews_for_job(found_job, wait_until_completion=True, timeout=60)
+        previews = PreviewManager.get_previews_for_job(found_job)
+        all_previews_list = previews.get('output', previews.get('input', []))
+        video_previews = [x for x in all_previews_list if x['kind'] == 'video']
+        image_previews = [x for x in all_previews_list if x['kind'] == 'image']
+        filtered_list = video_previews if video_previews and video_ok else image_previews
+        # todo - sort by size or other metrics here
+        if filtered_list:
+            preview_to_send = filtered_list[0]
+            mime_types = {'image': 'image/jpeg', 'video': 'video/mp4'}
+            file_mime_type = mime_types.get(preview_to_send['kind'], 'unknown')
+            return send_file(preview_to_send['filename'], mimetype=file_mime_type)
+    except Exception as e:
+        logger.error(f'Error getting thumbnail: {e}')
+        return f'Error getting thumbnail: {e}', 500
+    return "No thumbnail available", 404
+
+
+# --------------------------------------------
+# System Benchmarks:
+# --------------------------------------------
+
+
+@server.get('/api/cpu_benchmark')
+def get_cpu_benchmark_score():
+    from src.utilities.benchmark import cpu_benchmark
+    return str(cpu_benchmark(10))
+
+
+@server.get('/api/disk_benchmark')
+def get_disk_benchmark():
+    from src.utilities.benchmark import disk_io_benchmark
+    results = disk_io_benchmark()
+    return {'write_speed': results[0], 'read_speed': results[-1]}
+
+
+# --------------------------------------------
+# Error Handlers:
+# --------------------------------------------
+
+
+@server.errorhandler(JobNotFoundError)
+def handle_job_not_found(job_error):
+    return str(job_error), 400
+
+
+@server.errorhandler(DetachedInstanceError)
+def handle_detached_instance(_):
+    return "Unavailable", 503
+
+
+@server.errorhandler(Exception)
+def handle_general_error(general_error):
+    err_msg = f"Server error: {general_error}"
+    logger.error(err_msg)
+    return err_msg, 500
+
+
+# --------------------------------------------
+# Debug / Development Only:
+# --------------------------------------------
+
+
+@server.get('/api/_debug/detected_clients')
+def detected_clients():
+    # todo: dev/debug only. Should not ship this - probably.
+    from src.utilities.zeroconf_server import ZeroconfServer
+    return ZeroconfServer.found_hostnames()
+
+
+@server.get('/api/_debug/clear_history')
+def clear_history():
+    RenderQueue.clear_history()
+    return 'success'
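The three error handlers added above form a specific-to-generic chain: `JobNotFoundError` maps to 400, SQLAlchemy's `DetachedInstanceError` to 503, and any other exception to 500. Outside Flask, the same dispatch can be sketched as an ordered `isinstance` walk (the exception classes below are local stand-ins for the real ones, so the snippet is self-contained):

```python
# Hedged sketch of specific-to-generic error dispatch. Order matters: the
# Exception entry must come last or it would shadow the specific handlers.
class JobNotFoundError(Exception):
    pass


class DetachedInstanceError(Exception):
    pass


HANDLERS = [
    (JobNotFoundError, lambda e: (str(e), 400)),
    (DetachedInstanceError, lambda e: ("Unavailable", 503)),
    (Exception, lambda e: (f"Server error: {e}", 500)),
]


def to_response(exc):
    """Return (body, status) from the first handler matching the exception."""
    for exc_type, handler in HANDLERS:
        if isinstance(exc, exc_type):
            return handler(exc)
```

Flask performs an equivalent most-specific-match lookup itself, so handler registration order is not significant there; the explicit ordering here is only needed in the hand-rolled version.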
+113
@@ -0,0 +1,113 @@
import logging
import os
import subprocess
import threading
from pathlib import Path
from src.utilities.ffmpeg_helper import generate_thumbnail, save_first_frame
logger = logging.getLogger()
supported_video_formats = ['.mp4', '.mov', '.avi', '.mpg', '.mpeg', '.mxf', '.m4v', '.mkv', '.webm']
supported_image_formats = ['.jpg', '.png', '.exr', '.tif', '.tga', '.bmp', '.webp']
class PreviewManager:
storage_path = None
_running_jobs = {}
@classmethod
def __generate_job_preview_worker(cls, job, replace_existing=False, max_width=320):
# Determine best source file to use for thumbs
job_file_list = job.file_list()
source_files = job_file_list if job_file_list else [job.input_path]
preview_label = "output" if job_file_list else "input"
# filter by type
found_image_files = [f for f in source_files if os.path.splitext(f)[-1].lower() in supported_image_formats]
found_video_files = [f for f in source_files if os.path.splitext(f)[-1].lower() in supported_video_formats]
# check if we even have any valid files to work from
if source_files and not found_video_files and not found_image_files:
logger.warning(f"No valid image or video files found in files from job: {job}")
return
os.makedirs(cls.storage_path, exist_ok=True)
base_path = os.path.join(cls.storage_path, f"{job.id}-{preview_label}-{max_width}")
preview_video_path = base_path + '.mp4'
preview_image_path = base_path + '.jpg'
if replace_existing:
for x in [preview_image_path, preview_video_path]:
try:
os.remove(x)
except OSError:
pass
# Generate image previews
if (found_video_files or found_image_files) and not os.path.exists(preview_image_path):
try:
path_of_source = found_image_files[-1] if found_image_files else found_video_files[-1]
logger.debug(f"Generating image preview for {path_of_source}")
                save_first_frame(source_path=path_of_source, dest_path=preview_image_path, max_width=max_width)
                logger.debug(f"Successfully created image preview for {path_of_source}")
            except Exception as e:
                logger.error(f"Error generating image preview for {job}: {e}")

        # Generate video previews
        if found_video_files and not os.path.exists(preview_video_path):
            try:
                path_of_source = found_video_files[0]
                logger.debug(f"Generating video preview for {path_of_source}")
                generate_thumbnail(source_path=path_of_source, dest_path=preview_video_path, max_width=max_width)
                logger.debug(f"Successfully created video preview for {path_of_source}")
            except subprocess.CalledProcessError as e:
                logger.error(f"Error generating video preview for {job}: {e}")

    @classmethod
    def update_previews_for_job(cls, job, replace_existing=False, wait_until_completion=False, timeout=None):
        job_thread = cls._running_jobs.get(job.id)
        if job_thread and job_thread.is_alive():
            logger.debug(f'Preview generation job already running for {job}')
        else:
            job_thread = threading.Thread(target=cls.__generate_job_preview_worker, args=(job, replace_existing,))
            job_thread.start()
            cls._running_jobs[job.id] = job_thread
        if wait_until_completion:
            job_thread.join(timeout=timeout)

    @classmethod
    def get_previews_for_job(cls, job):
        results = {}
        try:
            directory_path = Path(cls.storage_path)
            preview_files_for_job = [f for f in directory_path.iterdir() if f.is_file() and f.name.startswith(job.id)]
            for preview_filename in preview_files_for_job:
                try:
                    pixel_width = str(preview_filename).split('-')[-1]
                    preview_label = str(os.path.basename(preview_filename)).split('-')[1]
                    extension = os.path.splitext(preview_filename)[-1].lower()
                    kind = 'video' if extension in supported_video_formats else \
                        'image' if extension in supported_image_formats else 'unknown'
                    results[preview_label] = results.get(preview_label, [])
                    results[preview_label].append({'filename': str(preview_filename), 'width': pixel_width, 'kind': kind})
                except IndexError:  # ignore invalid filenames
                    pass
        except FileNotFoundError:
            pass
        return results

    @classmethod
    def delete_previews_for_job(cls, job):
        all_previews = cls.get_previews_for_job(job)
        flattened_list = [item for sublist in all_previews.values() for item in sublist]
        for preview in flattened_list:
            try:
                logger.debug(f"Removing preview: {preview['filename']}")
                os.remove(preview['filename'])
            except OSError as e:
                logger.error(f"Error removing preview '{preview.get('filename')}': {e}")
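The filename-splitting convention that `get_previews_for_job` relies on can be sketched in isolation. This is a hypothetical standalone helper, not code from the repo; it assumes preview files are named `<job_id>-<label>-<width><ext>` and mirrors the `split('-')` logic above (including the quirk that the width field still carries the file extension):

```python
import os

SUPPORTED_VIDEO = {'.mp4', '.mov'}   # assumed stand-ins for supported_video_formats
SUPPORTED_IMAGE = {'.jpg', '.png'}   # assumed stand-ins for supported_image_formats

def parse_preview_filename(filename):
    """Split '<job_id>-<label>-<width><ext>' the same way get_previews_for_job does."""
    label = os.path.basename(filename).split('-')[1]
    width = filename.split('-')[-1]  # note: still includes the extension, as in the original
    ext = os.path.splitext(filename)[-1].lower()
    kind = 'video' if ext in SUPPORTED_VIDEO else 'image' if ext in SUPPORTED_IMAGE else 'unknown'
    return {'label': label, 'width': width, 'kind': kind}

print(parse_preview_filename('abc123-thumbnail-640.mp4'))
```

A filename without enough `-` separators raises `IndexError`, which is why the original wraps each file in a `try`/`except IndexError`.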
+189 -71
@@ -1,12 +1,12 @@
 import json
 import logging
 import os
-import socket
 import threading
 import time

 import requests
 from requests_toolbelt.multipart import MultipartEncoder, MultipartEncoderMonitor
+from urllib.parse import urljoin

 from src.utilities.misc_helper import is_localhost
 from src.utilities.status_utils import RenderStatus
@@ -16,13 +16,18 @@ status_colors = {RenderStatus.ERROR: "red", RenderStatus.CANCELLED: 'orange1',
                  RenderStatus.RUNNING: 'cyan', RenderStatus.WAITING_FOR_SUBJOBS: 'blue'}
 categories = [RenderStatus.RUNNING, RenderStatus.WAITING_FOR_SUBJOBS, RenderStatus.ERROR, RenderStatus.NOT_STARTED,
-              RenderStatus.SCHEDULED, RenderStatus.COMPLETED, RenderStatus.CANCELLED, RenderStatus.UNDEFINED]
+              RenderStatus.SCHEDULED, RenderStatus.COMPLETED, RenderStatus.CANCELLED, RenderStatus.UNDEFINED,
+              RenderStatus.CONFIGURING]
 logger = logging.getLogger()
-OFFLINE_MAX = 2
+OFFLINE_MAX = 4
+LOOPBACK = '127.0.0.1'


 class RenderServerProxy:
+    """The ServerProxy class is responsible for interacting with a remote server.
+    It provides convenience methods to request data from the server and store the status of the server.
+    """

     def __init__(self, hostname, server_port="8080"):
         self.hostname = hostname
@@ -34,6 +39,7 @@ class RenderServerProxy:
         self.__background_thread = None
         self.__offline_flags = 0
         self.update_cadence = 5
+        self.is_localhost = bool(is_localhost(hostname))

         # Cache some basic server info
         self.system_cpu = None
@@ -41,6 +47,13 @@ class RenderServerProxy:
         self.system_os = None
         self.system_os_version = None

+    # --------------------------------------------
+    # Basics / Connection:
+    # --------------------------------------------
+
+    def __repr__(self):
+        return f"<RenderServerProxy - {self.hostname}>"
+
     def connect(self):
         return self.status()
@@ -48,7 +61,7 @@ class RenderServerProxy:
         if self.__update_in_background:
             return self.__offline_flags < OFFLINE_MAX
         else:
-            return self.connect() is not None
+            return self.get_status() is not None

     def status(self):
         if not self.is_online():
@@ -56,11 +69,16 @@ class RenderServerProxy:
         running_jobs = [x for x in self.__jobs_cache if x['status'] == 'running'] if self.__jobs_cache else []
         return f"{len(running_jobs)} running" if running_jobs else "Ready"

+    # --------------------------------------------
+    # Requests:
+    # --------------------------------------------
+
     def request_data(self, payload, timeout=5):
         try:
             req = self.request(payload, timeout)
-            if req.ok and req.status_code == 200:
+            if req.ok:
                 self.__offline_flags = 0
-                return req.json()
+                if req.status_code == 200:
+                    return req.json()
         except json.JSONDecodeError as e:
             logger.debug(f"JSON decode error: {e}")
@@ -72,10 +90,22 @@ class RenderServerProxy:
             self.__offline_flags = self.__offline_flags + 1
         except Exception as e:
             logger.exception(f"Uncaught exception: {e}")
+
+        # If server unexpectedly drops off the network, stop background updates
+        if self.__offline_flags > OFFLINE_MAX:
+            try:
+                self.stop_background_update()
+            except KeyError:
+                pass
         return None

     def request(self, payload, timeout=5):
-        return requests.get(f'http://{self.hostname}:{self.port}/api/{payload}', timeout=timeout)
+        hostname = LOOPBACK if self.is_localhost else self.hostname
+        return requests.get(f'http://{hostname}:{self.port}/api/{payload}', timeout=timeout)
+
+    # --------------------------------------------
+    # Background Updates:
+    # --------------------------------------------

     def start_background_update(self):
         if self.__update_in_background:
@@ -83,27 +113,23 @@ class RenderServerProxy:
         self.__update_in_background = True

         def thread_worker():
+            logger.debug(f'Starting background updates for {self.hostname}')
             while self.__update_in_background:
                 self.__update_job_cache()
                 time.sleep(self.update_cadence)
+            logger.debug(f'Stopping background updates for {self.hostname}')

         self.__background_thread = threading.Thread(target=thread_worker)
         self.__background_thread.daemon = True
         self.__background_thread.start()

-    def stop_background_update(self):
-        self.__update_in_background = False
-
-    def get_job_info(self, job_id, timeout=5):
-        return self.request_data(f'job/{job_id}', timeout=timeout)
-
-    def get_all_jobs(self, timeout=5, ignore_token=False):
-        if not self.__update_in_background or ignore_token:
-            self.__update_job_cache(timeout, ignore_token)
-        return self.__jobs_cache.copy() if self.__jobs_cache else None
-
-    def __update_job_cache(self, timeout=5, ignore_token=False):
-        url = f'jobs?token={self.__jobs_cache_token}' if self.__jobs_cache_token and not ignore_token else 'jobs'
+    def __update_job_cache(self, timeout=40, ignore_token=False):
+        if self.__offline_flags:  # if we're offline, don't bother with the long poll
+            ignore_token = True
+        url = f'jobs_long_poll?token={self.__jobs_cache_token}' if (self.__jobs_cache_token and
+                                                                    not ignore_token) else 'jobs'
         status_result = self.request_data(url, timeout=timeout)
         if status_result is not None:
             sorted_jobs = []
@@ -114,9 +140,89 @@ class RenderServerProxy:
         self.__jobs_cache = sorted_jobs
         self.__jobs_cache_token = status_result['token']

+    def stop_background_update(self):
+        self.__update_in_background = False
+
+    # --------------------------------------------
+    # Get System Info:
+    # --------------------------------------------
+
+    def get_all_jobs(self, timeout=5, ignore_token=False):
+        if not self.__update_in_background or ignore_token:
+            self.__update_job_cache(timeout, ignore_token)
+        return self.__jobs_cache.copy() if self.__jobs_cache else None
+
     def get_data(self, timeout=5):
-        all_data = self.request_data('full_status', timeout=timeout)
-        return all_data
+        return self.request_data('full_status', timeout=timeout)
+
+    def get_status(self):
+        status = self.request_data('status')
+        if status and not self.system_cpu:
+            self.system_cpu = status['system_cpu']
+            self.system_cpu_count = status['cpu_count']
+            self.system_os = status['system_os']
+            self.system_os_version = status['system_os_version']
+        return status
+
+    # --------------------------------------------
+    # Get Job Info:
+    # --------------------------------------------
+
+    def get_job_info(self, job_id, timeout=5):
+        return self.request_data(f'job/{job_id}', timeout=timeout)
+
+    def get_job_files_list(self, job_id):
+        return self.request_data(f"job/{job_id}/file_list")
+
+    # --------------------------------------------
+    # Job Lifecycle:
+    # --------------------------------------------
+
+    def post_job_to_server(self, file_path, job_list, callback=None):
+        """
+        Posts a job to the server.
+
+        Args:
+            file_path (str): The path to the file to upload.
+            job_list (list): A list of jobs to post.
+            callback (function, optional): A callback function to call during the upload. Defaults to None.
+
+        Returns:
+            Response: The response from the server.
+        """
+        try:
+            # Check if file exists
+            if not os.path.exists(file_path):
+                raise FileNotFoundError(f"File not found: {file_path}")
+
+            # Bypass uploading file if posting to localhost
+            if self.is_localhost:
+                jobs_with_path = [{'local_path': file_path, **item} for item in job_list]
+                job_data = json.dumps(jobs_with_path)
+                url = urljoin(f'http://{LOOPBACK}:{self.port}', '/api/add_job')
+                headers = {'Content-Type': 'application/json'}
+                return requests.post(url, data=job_data, headers=headers)
+
+            # Prepare the form data for remote host
+            with open(file_path, 'rb') as file:
+                encoder = MultipartEncoder({
+                    'file': (os.path.basename(file_path), file, 'application/octet-stream'),
+                    'json': (None, json.dumps(job_list), 'application/json'),
+                })
+
+                # Create a monitor that will track the upload progress
+                monitor = MultipartEncoderMonitor(encoder, callback) if callback else MultipartEncoderMonitor(encoder)
+                headers = {'Content-Type': monitor.content_type}
+                url = urljoin(f'http://{self.hostname}:{self.port}', '/api/add_job')
+
+                # Send the request with proper resource management
+                with requests.post(url, data=monitor, headers=headers) as response:
+                    return response
+        except requests.ConnectionError as e:
+            logger.error(f"Connection error: {e}")
+        except Exception as e:
+            logger.error(f"An error occurred: {e}")

     def cancel_job(self, job_id, confirm=False):
         return self.request_data(f'job/{job_id}/cancel?confirm={confirm}')
@@ -124,14 +230,24 @@ class RenderServerProxy:
     def delete_job(self, job_id, confirm=False):
         return self.request_data(f'job/{job_id}/delete?confirm={confirm}')

-    def get_status(self):
-        status = self.request_data('status')
-        if not self.system_cpu:
-            self.system_cpu = status['system_cpu']
-            self.system_cpu_count = status['cpu_count']
-            self.system_os = status['system_os']
-            self.system_os_version = status['system_os_version']
-        return status
+    def send_subjob_update_notification(self, parent_id, subjob):
+        """
+        Notifies the parent job of an update in a subjob.
+
+        Args:
+            parent_id (str): The ID of the parent job.
+            subjob (Job): The subjob that has updated.
+
+        Returns:
+            Response: The response from the server.
+        """
+        hostname = LOOPBACK if self.is_localhost else self.hostname
+        return requests.post(f'http://{hostname}:{self.port}/api/job/{parent_id}/send_subjob_update_notification',
+                             json=subjob.json())
+
+    # --------------------------------------------
+    # Renderers:
+    # --------------------------------------------

     def is_engine_available(self, engine_name):
         return self.request_data(f'{engine_name}/is_available')
@@ -139,53 +255,55 @@ class RenderServerProxy:
     def get_all_engines(self):
         return self.request_data('all_engines')

-    def notify_parent_of_status_change(self, parent_id, subjob):
-        return requests.post(f'http://{self.hostname}:{self.port}/api/job/{parent_id}/notify_parent_of_status_change',
-                             json=subjob.json())
-
-    def post_job_to_server(self, file_path, job_list, callback=None):
-
-        # bypass uploading file if posting to localhost
-        if is_localhost(self.hostname):
-            jobs_with_path = [{**item, "local_path": file_path} for item in job_list]
-            return requests.post(f'http://{self.hostname}:{self.port}/api/add_job', data=json.dumps(jobs_with_path),
-                                 headers={'Content-Type': 'application/json'})
-
-        # Prepare the form data
-        encoder = MultipartEncoder({
-            'file': (os.path.basename(file_path), open(file_path, 'rb'), 'application/octet-stream'),
-            'json': (None, json.dumps(job_list), 'application/json'),
-        })
-
-        # Create a monitor that will track the upload progress
-        if callback:
-            monitor = MultipartEncoderMonitor(encoder, callback(encoder))
-        else:
-            monitor = MultipartEncoderMonitor(encoder)
-
-        # Send the request
-        headers = {'Content-Type': monitor.content_type}
-        return requests.post(f'http://{self.hostname}:{self.port}/api/add_job', data=monitor, headers=headers)
-
-    def get_job_files(self, job_id, save_path):
-        url = f"http://{self.hostname}:{self.port}/api/job/{job_id}/download_all"
-        return self.download_file(url, filename=save_path)
-
-    @staticmethod
-    def download_file(url, filename):
-        with requests.get(url, stream=True) as r:
-            r.raise_for_status()
-            with open(filename, 'wb') as f:
-                for chunk in r.iter_content(chunk_size=8192):
-                    f.write(chunk)
-        return filename
-
-    # --- Renderer --- #
-    def get_renderer_info(self, timeout=5):
-        all_data = self.request_data(f'renderer_info', timeout=timeout)
+    def get_renderer_info(self, response_type='standard', timeout=5):
+        """
+        Fetches renderer information from the server.
+
+        Args:
+            response_type (str, optional): Returns standard or full version of renderer info
+            timeout (int, optional): The number of seconds to wait for a response from the server. Defaults to 5.
+
+        Returns:
+            dict: A dictionary containing the renderer information.
+        """
+        all_data = self.request_data(f"renderer_info?response_type={response_type}", timeout=timeout)
         return all_data

     def delete_engine(self, engine, version, system_cpu=None):
+        """
+        Sends a request to the server to delete a specific engine.
+
+        Args:
+            engine (str): The name of the engine to delete.
+            version (str): The version of the engine to delete.
+            system_cpu (str, optional): The system CPU type. Defaults to None.
+
+        Returns:
+            Response: The response from the server.
+        """
         form_data = {'engine': engine, 'version': version, 'system_cpu': system_cpu}
-        return requests.post(f'http://{self.hostname}:{self.port}/api/delete_engine', json=form_data)
+        hostname = LOOPBACK if self.is_localhost else self.hostname
+        return requests.post(f'http://{hostname}:{self.port}/api/delete_engine', json=form_data)
+
+    # --------------------------------------------
+    # Download Files:
+    # --------------------------------------------
+
+    def download_all_job_files(self, job_id, save_path):
+        hostname = LOOPBACK if self.is_localhost else self.hostname
+        url = f"http://{hostname}:{self.port}/api/job/{job_id}/download_all"
+        return self.__download_file_from_url(url, output_filepath=save_path)
+
+    def download_job_file(self, job_id, job_filename, save_path):
+        hostname = LOOPBACK if self.is_localhost else self.hostname
+        url = f"http://{hostname}:{self.port}/api/job/{job_id}/download?filename={job_filename}"
+        return self.__download_file_from_url(url, output_filepath=save_path)
+
+    @staticmethod
+    def __download_file_from_url(url, output_filepath):
+        with requests.get(url, stream=True) as r:
+            r.raise_for_status()
+            with open(output_filepath, 'wb') as f:
+                for chunk in r.iter_content(chunk_size=8192):
+                    f.write(chunk)
+        return output_filepath
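The endpoint selection in the new `__update_job_cache` reduces to a small pure function: hold a long poll open only when a cache token exists, the caller has not opted out, and the server is not flagged offline. A standalone sketch (function name hypothetical, not part of the repo):

```python
def choose_jobs_endpoint(cache_token, offline_flags, ignore_token=False):
    """Mirror the URL choice in __update_job_cache: fall back to the plain
    'jobs' endpoint when offline, when there is no cached token, or when
    the caller explicitly ignores the token."""
    if offline_flags:  # offline: don't hold a long poll open
        ignore_token = True
    if cache_token and not ignore_token:
        return f'jobs_long_poll?token={cache_token}'
    return 'jobs'

print(choose_jobs_endpoint(None, 0))    # first fetch: no token yet
print(choose_jobs_endpoint('abc', 0))   # steady state: long poll with token
print(choose_jobs_endpoint('abc', 2))   # offline: short plain request
```

This also explains the default `timeout=40` in the diff: a long poll can legitimately block far longer than the 5-second timeout used for ordinary requests.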
+3 -3
@@ -17,19 +17,19 @@ class ServerProxyManager:
         pub.subscribe(cls.__zeroconf_state_change, 'zeroconf_state_change')

     @classmethod
-    def __zeroconf_state_change(cls, hostname, state_change, info):
+    def __zeroconf_state_change(cls, hostname, state_change):
         if state_change == ServiceStateChange.Added or state_change == ServiceStateChange.Updated:
             cls.get_proxy_for_hostname(hostname)
         else:
+            cls.get_proxy_for_hostname(hostname).stop_background_update()
             cls.server_proxys.pop(hostname)

     @classmethod
     def get_proxy_for_hostname(cls, hostname):
         found_proxy = cls.server_proxys.get(hostname)
-        if not found_proxy:
+        if hostname and not found_proxy:
             new_proxy = RenderServerProxy(hostname)
             new_proxy.start_background_update()
             cls.server_proxys[hostname] = new_proxy
             found_proxy = new_proxy
         return found_proxy
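The get-or-create pattern in `get_proxy_for_hostname` (with the new guard against falsy hostnames) can be sketched with a stub proxy class. All names here are hypothetical stand-ins for illustration only:

```python
class StubProxy:
    """Stand-in for RenderServerProxy; records whether background updates started."""
    def __init__(self, hostname):
        self.hostname = hostname
        self.updating = False

    def start_background_update(self):
        self.updating = True

_proxies = {}  # stand-in for ServerProxyManager.server_proxys

def get_proxy_for_hostname(hostname):
    """Get-or-create, as in the diff: skip falsy hostnames, start updates once."""
    found = _proxies.get(hostname)
    if hostname and not found:
        found = StubProxy(hostname)
        found.start_background_update()
        _proxies[hostname] = found
    return found

a = get_proxy_for_hostname('render01.local')
b = get_proxy_for_hostname('render01.local')
print(a is b, a.updating, get_proxy_for_hostname(None))
```

Repeated lookups return the same cached proxy, so `start_background_update` runs exactly once per hostname; the `if hostname` guard is what the `+3 -3` change adds.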
+239 -212
@@ -1,15 +1,18 @@
import logging import logging
import os import os
import socket import socket
import threading
import time import time
import zipfile
from plyer import notification from plyer import notification
from pubsub import pub from pubsub import pub
from src.api.preview_manager import PreviewManager
from src.api.server_proxy import RenderServerProxy from src.api.server_proxy import RenderServerProxy
from src.engines.engine_manager import EngineManager
from src.render_queue import RenderQueue from src.render_queue import RenderQueue
from src.utilities.misc_helper import get_file_size_human from src.utilities.config import Config
from src.utilities.server_helper import download_missing_frames_from_subjob, distribute_server_work
from src.utilities.status_utils import RenderStatus, string_to_status from src.utilities.status_utils import RenderStatus, string_to_status
from src.utilities.zeroconf_server import ZeroconfServer from src.utilities.zeroconf_server import ZeroconfServer
@@ -28,6 +31,43 @@ class DistributedJobManager:
This should be called once, typically during the initialization phase. This should be called once, typically during the initialization phase.
""" """
pub.subscribe(cls.__local_job_status_changed, 'status_change') pub.subscribe(cls.__local_job_status_changed, 'status_change')
pub.subscribe(cls.__local_job_frame_complete, 'frame_complete')
@classmethod
def __local_job_frame_complete(cls, job_id, frame_number, update_interval=5):
"""
Responds to the 'frame_complete' pubsub message for local jobs.
Args:
job_id (str): The ID of the job that has changed status.
old_status (str): The previous status of the job.
new_status (str): The new (current) status of the job.
Note: Do not call directly. Instead, call via the 'frame_complete' pubsub message.
"""
render_job = RenderQueue.job_with_id(job_id, none_ok=True)
if not render_job: # ignore jobs not in the queue
return
logger.debug(f"Job {job_id} has completed frame #{frame_number}")
replace_existing_previews = (frame_number % update_interval) == 0
cls.__job_update_shared(render_job, replace_existing_previews)
@classmethod
def __job_update_shared(cls, render_job, replace_existing_previews=False):
# update previews
PreviewManager.update_previews_for_job(job=render_job, replace_existing=replace_existing_previews)
# notify parent to allow individual frames to be copied instead of waiting until the end
if render_job.parent:
parent_id, parent_hostname = render_job.parent.split('@')[0], render_job.parent.split('@')[-1]
try:
logger.debug(f'Job {render_job.id} updating parent {parent_id}@{parent_hostname}')
RenderServerProxy(parent_hostname).send_subjob_update_notification(parent_id, render_job)
except Exception as e:
logger.error(f"Error notifying parent {parent_hostname} about update in subjob {render_job.id}: {e}")
@classmethod @classmethod
def __local_job_status_changed(cls, job_id, old_status, new_status): def __local_job_status_changed(cls, job_id, old_status, new_status):
@@ -35,7 +75,7 @@ class DistributedJobManager:
Responds to the 'status_change' pubsub message for local jobs. Responds to the 'status_change' pubsub message for local jobs.
If it's a child job, it notifies the parent job about the status change. If it's a child job, it notifies the parent job about the status change.
Parameters: Args:
job_id (str): The ID of the job that has changed status. job_id (str): The ID of the job that has changed status.
old_status (str): The previous status of the job. old_status (str): The previous status of the job.
new_status (str): The new (current) status of the job. new_status (str): The new (current) status of the job.
@@ -48,34 +88,34 @@ class DistributedJobManager:
return return
logger.debug(f"Job {job_id} status change: {old_status} -> {new_status}") logger.debug(f"Job {job_id} status change: {old_status} -> {new_status}")
if render_job.parent: # If local job is a subjob from a remote server
parent_id, hostname = render_job.parent.split('@')[0], render_job.parent.split('@')[-1]
RenderServerProxy(hostname).notify_parent_of_status_change(parent_id=parent_id, subjob=render_job)
# handle cancelling all the children cls.__job_update_shared(render_job, replace_existing_previews=(render_job.status == RenderStatus.COMPLETED))
elif render_job.children and new_status in [RenderStatus.CANCELLED, RenderStatus.ERROR]:
# Handle children
if render_job.children:
if new_status in [RenderStatus.CANCELLED, RenderStatus.ERROR]: # Cancel children if necessary
for child in render_job.children: for child in render_job.children:
child_id, hostname = child.split('@') child_id, child_hostname = child.split('@')
RenderServerProxy(hostname).cancel_job(child_id, confirm=True) RenderServerProxy(child_hostname).cancel_job(child_id, confirm=True)
# UI Notifications # UI Notifications
try: try:
if new_status == RenderStatus.COMPLETED: if new_status == RenderStatus.COMPLETED:
logger.debug("show render complete notification") logger.debug("Show render complete notification")
notification.notify( notification.notify(
title='Render Job Complete', title='Render Job Complete',
message=f'{render_job.name} completed succesfully', message=f'{render_job.name} completed succesfully',
timeout=10 # Display time in seconds timeout=10 # Display time in seconds
) )
elif new_status == RenderStatus.ERROR: elif new_status == RenderStatus.ERROR:
logger.debug("show render complete notification") logger.debug("Show render error notification")
notification.notify( notification.notify(
title='Render Job Failed', title='Render Job Failed',
message=f'{render_job.name} failed rendering', message=f'{render_job.name} failed rendering',
timeout=10 # Display time in seconds timeout=10 # Display time in seconds
) )
elif new_status == RenderStatus.RUNNING: elif new_status == RenderStatus.RUNNING:
logger.debug("show render complete notification") logger.debug("Show render started notification")
notification.notify( notification.notify(
title='Render Job Started', title='Render Job Started',
message=f'{render_job.name} started rendering', message=f'{render_job.name} started rendering',
@@ -84,97 +124,116 @@ class DistributedJobManager:
except Exception as e: except Exception as e:
logger.debug(f"Unable to show UI notification: {e}") logger.debug(f"Unable to show UI notification: {e}")
@classmethod # --------------------------------------------
def handle_subjob_status_change(cls, local_job, subjob_data): # Create Job
""" # --------------------------------------------
Responds to a status change from a remote subjob and triggers the creation or modification of subjobs as needed.
Parameters: @classmethod
local_job (BaseRenderWorker): The local parent job worker. def create_render_job(cls, new_job_attributes, loaded_project_local_path):
subjob_data (dict): subjob data sent from remote server. """Creates render jobs. Pass in dict of job_data and the local path to the project. It creates and returns a new
render job.
Args:
new_job_attributes (dict): Dict of desired attributes for new job (frame count, renderer, output path, etc)
loaded_project_local_path (str): The local path to the loaded project.
Returns: Returns:
None worker: Created job worker
"""
# get new output path in output_dir
output_path = new_job_attributes.get('output_path')
if not output_path:
loaded_project_filename = os.path.basename(loaded_project_local_path)
output_filename = os.path.splitext(loaded_project_filename)[0]
else:
output_filename = os.path.basename(output_path)
# Prepare output path
output_dir = os.path.join(os.path.dirname(os.path.dirname(loaded_project_local_path)), 'output')
output_path = os.path.join(output_dir, output_filename)
os.makedirs(output_dir, exist_ok=True)
logger.debug(f"New job output path: {output_path}")
# create & configure jobs
worker = EngineManager.create_worker(renderer=new_job_attributes['renderer'],
input_path=loaded_project_local_path,
output_path=output_path,
engine_version=new_job_attributes.get('engine_version'),
args=new_job_attributes.get('args', {}),
parent=new_job_attributes.get('parent'),
name=new_job_attributes.get('name'))
worker.status = new_job_attributes.get("initial_status", worker.status) # todo: is this necessary?
worker.priority = int(new_job_attributes.get('priority', worker.priority))
worker.start_frame = int(new_job_attributes.get("start_frame", worker.start_frame))
worker.end_frame = int(new_job_attributes.get("end_frame", worker.end_frame))
worker.watchdog_timeout = Config.worker_process_timeout
worker.hostname = socket.gethostname()
# determine if we can / should split the job
if new_job_attributes.get("enable_split_jobs", False) and (worker.total_frames > 1) and not worker.parent:
cls.split_into_subjobs_async(worker, new_job_attributes, loaded_project_local_path)
else:
worker.status = RenderStatus.NOT_STARTED
RenderQueue.add_to_render_queue(worker, force_start=new_job_attributes.get('force_start', False))
PreviewManager.update_previews_for_job(worker)
return worker
# --------------------------------------------
# Handling Subjobs
# --------------------------------------------
@classmethod
def handle_subjob_update_notification(cls, local_job, subjob_data):
"""Responds to a notification from a remote subjob and the host requests any subsequent updates from the subjob.
Args:
local_job (BaseRenderWorker): The local parent job worker.
subjob_data (dict): Subjob data sent from the remote server.
""" """
subjob_status = string_to_status(subjob_data['status']) subjob_status = string_to_status(subjob_data['status'])
subjob_id = subjob_data['id'] subjob_id = subjob_data['id']
subjob_hostname = next((hostname.split('@')[1] for hostname in local_job.children if subjob_hostname = subjob_data['hostname']
hostname.split('@')[0] == subjob_id), None) subjob_key = f'{subjob_id}@{subjob_hostname}'
local_job.children[f'{subjob_id}@{subjob_hostname}'] = subjob_data old_status = local_job.children.get(subjob_key, {}).get('status')
local_job.children[subjob_key] = subjob_data
logname = f"{local_job.id}:{subjob_id}@{subjob_hostname}" logname = f"<Parent: {local_job.id} | Child: {subjob_key}>"
if old_status != subjob_status.value:
logger.debug(f"Subjob status changed: {logname} -> {subjob_status.value}") logger.debug(f"Subjob status changed: {logname} -> {subjob_status.value}")
# Download complete or partial render jobs download_success = download_missing_frames_from_subjob(local_job, subjob_id, subjob_hostname)
if subjob_status in [RenderStatus.COMPLETED, RenderStatus.CANCELLED, RenderStatus.ERROR] and \ if subjob_data['status'] == 'completed' and download_success:
subjob_data['file_count']: local_job.children[subjob_key]['download_status'] = 'completed'
download_result = cls.download_from_subjob(local_job, subjob_id, subjob_hostname)
if not download_result:
# todo: handle error
logger.error(f"Unable to download subjob files from {logname} with status {subjob_status.value}")
if subjob_status == RenderStatus.CANCELLED or subjob_status == RenderStatus.ERROR:
# todo: determine missing frames and schedule new job
pass
@staticmethod
def download_from_subjob(local_job, subjob_id, subjob_hostname):
"""
Downloads and extracts files from a completed subjob on a remote server.
Parameters:
local_job (BaseRenderWorker): The local parent job worker.
subjob_id (str or int): The ID of the subjob.
subjob_hostname (str): The hostname of the remote server where the subjob is located.
Returns:
bool: True if the files have been downloaded and extracted successfully, False otherwise.
"""
child_key = f'{subjob_id}@{subjob_hostname}'
logname = f"{local_job.id}:{child_key}"
zip_file_path = local_job.output_path + f'_{subjob_hostname}_{subjob_id}.zip'
# download zip file from server
try:
local_job.children[child_key]['download_status'] = 'working'
logger.info(f"Downloading completed subjob files from {subjob_hostname} to localhost")
RenderServerProxy(subjob_hostname).get_job_files(subjob_id, zip_file_path)
logger.info(f"File transfer complete for {logname} - Transferred {get_file_size_human(zip_file_path)}")
except Exception as e:
logger.exception(f"Exception downloading files from remote server: {e}")
local_job.children[child_key]['download_status'] = 'failed'
return False
# extract zip
try:
logger.debug(f"Extracting zip file: {zip_file_path}")
extract_path = os.path.dirname(zip_file_path)
with zipfile.ZipFile(zip_file_path, 'r') as zip_ref:
zip_ref.extractall(extract_path)
logger.info(f"Successfully extracted zip to: {extract_path}")
os.remove(zip_file_path)
local_job.children[child_key]['download_status'] = 'complete'
except Exception as e:
logger.exception(f"Exception extracting zip file: {e}")
local_job.children[child_key]['download_status'] = 'failed'
return local_job.children[child_key].get('download_status', None) == 'complete'
@classmethod @classmethod
def wait_for_subjobs(cls, local_job): def wait_for_subjobs(cls, parent_job):
logger.debug(f"Waiting for subjobs for job {local_job}") """Check the status of subjobs and waits until they are all finished. Download rendered frames from subjobs
local_job.status = RenderStatus.WAITING_FOR_SUBJOBS when they are completed.
Args:
parent_job: Worker object that has child jobs
Returns:
"""
logger.debug(f"Waiting for subjobs for job {parent_job}")
parent_job.status = RenderStatus.WAITING_FOR_SUBJOBS
statuses_to_download = [RenderStatus.CANCELLED, RenderStatus.ERROR, RenderStatus.COMPLETED] statuses_to_download = [RenderStatus.CANCELLED, RenderStatus.ERROR, RenderStatus.COMPLETED]
def subjobs_not_downloaded(): def subjobs_not_downloaded():
return {k: v for k, v in local_job.children.items() if 'download_status' not in v or return {k: v for k, v in parent_job.children.items() if 'download_status' not in v or
v['download_status'] == 'working' or v['download_status'] is None} v['download_status'] == 'working' or v['download_status'] is None}
logger.info(f'Waiting on {len(subjobs_not_downloaded())} subjobs for {local_job.id}') logger.info(f'Waiting on {len(subjobs_not_downloaded())} subjobs for {parent_job.id}')
while len(subjobs_not_downloaded()): server_delay = 10
sleep_counter = 0
while parent_job.status == RenderStatus.WAITING_FOR_SUBJOBS:
if sleep_counter % server_delay == 0: # only ping servers every x seconds
for child_key, subjob_cached_data in subjobs_not_downloaded().items(): for child_key, subjob_cached_data in subjobs_not_downloaded().items():
subjob_id = child_key.split('@')[0] subjob_id = child_key.split('@')[0]
@@ -183,174 +242,128 @@ class DistributedJobManager:
# Fetch info from server and handle failing case
subjob_data = RenderServerProxy(subjob_hostname).get_job_info(subjob_id)
if not subjob_data:
logger.warning(f"No response from {subjob_hostname}")
# timeout / missing server situations
parent_job.children[child_key]['download_status'] = f'error: No response from {subjob_hostname}'
continue
# Update parent job cache but keep the download status
download_status = parent_job.children[child_key].get('download_status', None)
parent_job.children[child_key] = subjob_data
parent_job.children[child_key]['download_status'] = download_status
status = string_to_status(subjob_data.get('status', ''))
status_msg = f"Subjob {child_key} | {status} | " \
f"{float(subjob_data.get('percent_complete')) * 100.0}%"
logger.debug(status_msg)
# Still working in another thread - keep waiting
if download_status == 'working':
continue
# Check if job is finished, but has not had files copied over yet
if download_status is None and subjob_data['file_count'] and status in statuses_to_download:
try:
download_missing_frames_from_subjob(parent_job, subjob_id, subjob_hostname)
parent_job.children[child_key]['download_status'] = 'complete'
except Exception as e:
logger.error(f"Error downloading missing frames from subjob: {e}")
parent_job.children[child_key]['download_status'] = f'error: {e}'
# Any finished jobs not successfully downloaded at this point are skipped
if parent_job.children[child_key].get('download_status', None) is None and \
status in statuses_to_download:
logger.warning(f"Skipping waiting on downloading from subjob: {child_key}")
parent_job.children[child_key]['download_status'] = 'skipped'
if subjobs_not_downloaded():
logger.debug(f"Waiting on {len(subjobs_not_downloaded())} subjobs on "
f"{', '.join(list(subjobs_not_downloaded().keys()))}")
time.sleep(1)
sleep_counter += 1
else: # exit the loop
parent_job.status = RenderStatus.RUNNING
# --------------------------------------------
# Creating Subjobs
# --------------------------------------------
@classmethod
def split_into_subjobs_async(cls, parent_worker, new_job_attributes, project_path, system_os=None):
# todo: I don't love this
parent_worker.status = RenderStatus.CONFIGURING
cls.background_worker = threading.Thread(target=cls.split_into_subjobs, args=(parent_worker, new_job_attributes,
project_path, system_os))
cls.background_worker.start()
@classmethod
def split_into_subjobs(cls, parent_worker, new_job_attributes, project_path, system_os=None, specific_servers=None):
"""
Splits a job into subjobs and distributes them among available servers.
This method checks the availability of servers, distributes the work among them, and creates subjobs on each
server. If a server is the local host, it adjusts the frame range of the parent job instead of creating a
subjob.
Args:
parent_worker (Worker): The parent job that we're creating the subjobs for.
new_job_attributes (dict): Dict of desired attributes for new job (frame count, renderer, output path, etc)
project_path (str): The path to the project.
system_os (str, optional): Required OS. Default is any.
specific_servers (list, optional): List of specific servers to split work between. Defaults to all found.
"""
# Check availability
available_servers = specific_servers if specific_servers else cls.find_available_servers(parent_worker.renderer,
system_os)
# skip if there are no external servers found
external_servers = [x for x in available_servers if x['hostname'] != parent_worker.hostname]
if not external_servers:
parent_worker.status = RenderStatus.NOT_STARTED
return
logger.debug(f"Splitting into subjobs - Available servers: {[x['hostname'] for x in available_servers]}")
all_subjob_server_data = distribute_server_work(parent_worker.start_frame, parent_worker.end_frame, available_servers)
# Prep and submit these sub-jobs
logger.info(f"Job {parent_worker.id} split plan: {all_subjob_server_data}")
try:
for subjob_data in all_subjob_server_data:
subjob_hostname = subjob_data['hostname']
post_results = cls.__create_subjob(new_job_attributes, project_path, subjob_data, subjob_hostname,
parent_worker)
if not post_results.ok:
raise ValueError(f"Failed to create subjob on {subjob_hostname}")
# save child info
submission_results = post_results.json()[0]
child_key = f"{submission_results['id']}@{subjob_hostname}"
parent_worker.children[child_key] = submission_results
# start subjobs
logger.debug(f"Created {len(all_subjob_server_data)} subjobs successfully")
parent_worker.name = f"{parent_worker.name} (Parent)"
parent_worker.status = RenderStatus.NOT_STARTED  # todo: this won't work with scheduled starts
except Exception as e:
# cancel all the subjobs
logger.error(f"Failed to split job into subjobs: {e}")
logger.debug(f"Cancelling {len(all_subjob_server_data) - 1} attempted subjobs")
RenderServerProxy(parent_worker.hostname).cancel_job(parent_worker.id, confirm=True)
@staticmethod
def __create_subjob(new_job_attributes, project_path, server_data, server_hostname, parent_worker):
"""Convenience method to create subjobs for a parent worker"""
subjob = new_job_attributes.copy()
subjob['name'] = f"{parent_worker.name}[{server_data['frame_range'][0]}-{server_data['frame_range'][-1]}]"
subjob['parent'] = f"{parent_worker.id}@{parent_worker.hostname}"
subjob['start_frame'] = server_data['frame_range'][0] subjob['start_frame'] = server_data['frame_range'][0]
subjob['end_frame'] = server_data['frame_range'][-1] subjob['end_frame'] = server_data['frame_range'][-1]
subjob['engine_version'] = parent_worker.renderer_version
logger.debug(f"Posting subjob with frames {subjob['start_frame']}-"
f"{subjob['end_frame']} to {server_hostname}")
post_results = RenderServerProxy(server_hostname).post_job_to_server(
file_path=project_path, job_list=[subjob])
return post_results
# --------------------------------------------
# Server Handling
# --------------------------------------------
@staticmethod
def find_available_servers(engine_name, system_os=None):
@@ -370,3 +383,17 @@ class DistributedJobManager:
available_servers.append(response)
return available_servers
if __name__ == '__main__':
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
ZeroconfServer.configure("_zordon._tcp.local.", 'testing', 8080)
ZeroconfServer.start(listen_only=True)
print("Starting Zeroconf...")
time.sleep(2)
available_servers = DistributedJobManager.find_available_servers('blender')
print(f"AVAILABLE SERVERS ({len(available_servers)}): {available_servers}")
# results = distribute_server_work(1, 100, available_servers)
# print(f"RESULTS: {results}")
ZeroconfServer.stop()
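The `distribute_server_work` call above splits the parent job's frame range across servers in proportion to each server's CPU count. A minimal standalone sketch of that allocation policy (the helper name and server dicts below are illustrative, assuming `hostname` and `cpu_count` keys as described for `find_available_servers`):

```python
# Hypothetical sketch (not the project's actual helper): allocate a frame range
# across servers in proportion to each server's 'cpu_count', with the last
# server taking any rounding remainder.
def split_frames_by_cpu(start_frame, end_frame, servers):
    total_frames = end_frame - start_frame + 1
    total_cpus = sum(s['cpu_count'] for s in servers)
    ranges = {}
    current = start_frame
    allocated = 0
    for i, server in enumerate(servers):
        if i == len(servers) - 1:
            count = total_frames - allocated  # remainder goes to the last server
        else:
            count = round(server['cpu_count'] / total_cpus * total_frames)
        allocated += count
        last = current + count - 1
        if current <= last:
            ranges[server['hostname']] = (current, last)
        current = last + 1
    return ranges


servers = [{'hostname': 'fast-box', 'cpu_count': 24}, {'hostname': 'slow-box', 'cpu_count': 8}]
print(split_frames_by_cpu(1, 100, servers))  # {'fast-box': (1, 75), 'slow-box': (76, 100)}
```

Giving the remainder to the last server keeps the total frame count exact even when the proportional shares don't divide evenly.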
@@ -1,5 +1,6 @@
import logging
import re
import threading
import requests
@@ -7,8 +8,7 @@ from src.engines.blender.blender_engine import Blender
from src.engines.core.base_downloader import EngineDownloader
from src.utilities.misc_helper import current_system_os, current_system_cpu
url = "https://download.blender.org/release/"
logger = logging.getLogger()
supported_formats = ['.zip', '.tar.xz', '.dmg']
@@ -43,11 +43,13 @@ class BlenderDownloader(EngineDownloader):
response = requests.get(base_url, timeout=5)
response.raise_for_status()
versions_pattern = \
r'<a href="(?P<file>[^"]+)">blender-(?P<version>[\d\.]+)-(?P<system_os>\w+)-(?P<cpu>\w+).*</a>'
versions_data = [match.groupdict() for match in re.finditer(versions_pattern, response.text)]
# Filter to just the supported formats
versions_data = [item for item in versions_data if any(item["file"].endswith(ext) for ext in
supported_formats)]
# Filter down OS and CPU
system_os = system_os or current_system_os()
@@ -78,6 +80,31 @@ class BlenderDownloader(EngineDownloader):
return lts_versions return lts_versions
@classmethod
def all_versions(cls, system_os=None, cpu=None):
majors = cls.__get_major_versions()
all_versions = []
threads = []
results = [[] for _ in majors]
def thread_function(major_version, index, system_os, cpu):
results[index] = cls.__get_minor_versions(major_version, system_os, cpu)
for i, m in enumerate(majors):
thread = threading.Thread(target=thread_function, args=(m, i, system_os, cpu))
threads.append(thread)
thread.start()
# Wait for all threads to complete
for thread in threads:
thread.join()
# Extend all_versions with the results from each thread
for result in results:
all_versions.extend(result)
return all_versions
@classmethod
def find_most_recent_version(cls, system_os=None, cpu=None, lts_only=False):
try:
@@ -105,9 +132,8 @@ class BlenderDownloader(EngineDownloader):
try:
logger.info(f"Requesting download of blender-{version}-{system_os}-{cpu}")
major_version = '.'.join(version.split('.')[:2])
minor_versions = [x for x in cls.__get_minor_versions(major_version, system_os, cpu) if
x['version'] == version]
cls.download_and_extract_app(remote_url=minor_versions[0]['url'], download_location=download_location,
timeout=timeout)
except IndexError:
@@ -117,5 +143,4 @@ class BlenderDownloader(EngineDownloader):
if __name__ == '__main__':
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
print(BlenderDownloader.find_most_recent_version())
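The `versions_pattern` above scrapes Blender's release-index HTML for downloadable builds. A quick standalone check of what the named groups capture, run against a sample anchor tag (the HTML snippet is illustrative, not a live response):

```python
import re

# Same pattern as the downloader's versions_pattern, applied to a sample
# release-index anchor tag to show what the named groups capture.
versions_pattern = \
    r'<a href="(?P<file>[^"]+)">blender-(?P<version>[\d\.]+)-(?P<system_os>\w+)-(?P<cpu>\w+).*</a>'

sample_html = '<a href="blender-4.1.1-linux-x64.tar.xz">blender-4.1.1-linux-x64.tar.xz</a>'
matches = [m.groupdict() for m in re.finditer(versions_pattern, sample_html)]
print(matches[0])  # file, version, system_os, and cpu parsed from the filename
```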
@@ -6,6 +6,8 @@ from src.utilities.misc_helper import system_safe_path
logger = logging.getLogger()
_creationflags = subprocess.CREATE_NO_WINDOW if platform.system() == 'Windows' else 0
class Blender(BaseRenderEngine):
@@ -23,7 +25,11 @@ class Blender(BaseRenderEngine):
return BlenderRenderWorker
@staticmethod
def ui_options(system_info):
from src.engines.blender.blender_ui import BlenderUI
return BlenderUI.get_options(system_info)
def supported_extensions(self):
return ['blend']
def version(self):
@@ -31,7 +37,8 @@ class Blender(BaseRenderEngine):
try:
render_path = self.renderer_path()
if render_path:
ver_out = subprocess.check_output([render_path, '-v'], timeout=SUBPROCESS_TIMEOUT,
creationflags=_creationflags)
version = ver_out.decode('utf-8').splitlines()[0].replace('Blender', '').strip()
except Exception as e:
logger.error(f'Failed to get Blender version: {e}')
@@ -46,31 +53,42 @@ class Blender(BaseRenderEngine):
if os.path.exists(project_path):
try:
return subprocess.run([self.renderer_path(), '-b', project_path, '--python-expr', python_expression],
capture_output=True, timeout=timeout, creationflags=_creationflags)
except Exception as e:
err_msg = f"Error running python expression in blender: {e}"
logger.error(err_msg)
raise ChildProcessError(err_msg)
else:
raise FileNotFoundError(f'Project file not found: {project_path}')
def run_python_script(self, script_path, project_path=None, timeout=None):
if project_path and not os.path.exists(project_path):
raise FileNotFoundError(f'Project file not found: {project_path}')
elif not os.path.exists(script_path):
raise FileNotFoundError(f'Python script not found: {script_path}')
try:
command = [self.renderer_path(), '-b', '--python', script_path]
if project_path:
command.insert(2, project_path)
result = subprocess.run(command, capture_output=True, timeout=timeout, creationflags=_creationflags)
return result
except subprocess.TimeoutExpired:
err_msg = f"Timed out after {timeout}s while running python script in blender: {script_path}"
logger.error(err_msg)
raise TimeoutError(err_msg)
except Exception as e:
err_msg = f"Error running python script in blender: {e}"
logger.error(err_msg)
raise ChildProcessError(err_msg)
def get_project_info(self, project_path, timeout=10):
scene_info = {}
try:
script_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'scripts', 'get_file_info.py')
results = self.run_python_script(project_path=project_path, script_path=system_safe_path(script_path),
timeout=timeout)
result_text = results.stdout.decode()
for line in result_text.splitlines():
if line.startswith('SCENE_DATA:'):
@@ -80,15 +98,18 @@ class Blender(BaseRenderEngine):
elif line.startswith('Error'):
logger.error(f"get_scene_info error: {line.strip()}")
except Exception as e:
msg = f'Error getting file details for .blend file: {e}'
logger.error(msg)
raise ChildProcessError(msg)
return scene_info
def pack_project_file(self, project_path, timeout=None):
# Credit to L0Lock for pack script - https://blender.stackexchange.com/a/243935
try:
logger.info(f"Starting to pack Blender file: {project_path}")
script_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'scripts', 'pack_project.py')
results = self.run_python_script(project_path=project_path, script_path=system_safe_path(script_path),
timeout=timeout)
result_text = results.stdout.decode()
dir_name = os.path.dirname(project_path)
@@ -105,11 +126,13 @@ class Blender(BaseRenderEngine):
logger.info(f'Blender file packed successfully to {new_path}')
return new_path
except Exception as e:
msg = f'Error packing .blend file: {e}'
logger.error(msg)
raise ChildProcessError(msg)
return None
def get_arguments(self):
help_text = subprocess.check_output([self.renderer_path(), '-h'], creationflags=_creationflags).decode('utf-8')
lines = help_text.splitlines()
options = {}
@@ -140,31 +163,32 @@ class Blender(BaseRenderEngine):
return options
def system_info(self):
return {'render_devices': self.get_render_devices()}
def get_render_devices(self):
script_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'scripts', 'get_system_info.py')
results = self.run_python_script(script_path=script_path)
output = results.stdout.decode()
match = re.search(r"GPU DATA:(\[[\s\S]*\])", output)
if match:
gpu_data_json = match.group(1)
gpus_info = json.loads(gpu_data_json)
return gpus_info
else:
logger.error("GPU data not found in the output.")
def supported_render_engines(self):
engine_output = subprocess.run([self.renderer_path(), '-E', 'help'], timeout=SUBPROCESS_TIMEOUT,
capture_output=True, creationflags=_creationflags).stdout.decode('utf-8').strip()
render_engines = [x.strip() for x in engine_output.split('Blender Engine Listing:')[-1].strip().splitlines()]
return render_engines
def perform_presubmission_tasks(self, project_path):
packed_path = self.pack_project_file(project_path, timeout=120)
return packed_path
if __name__ == "__main__":
x = Blender().get_render_devices()
print(x)
@@ -0,0 +1,9 @@
class BlenderUI:
@staticmethod
def get_options(system_info):
options = [
{'name': 'engine', 'options': system_info.get('engines', [])},
{'name': 'render_device', 'options': ['Any', 'GPU', 'CPU']},
]
return options
@@ -12,13 +12,7 @@ class BlenderRenderWorker(BaseRenderWorker):
engine = Blender
def __init__(self, input_path, output_path, engine_path, args=None, parent=None, name=None):
super(BlenderRenderWorker, self).__init__(input_path=input_path, output_path=output_path, engine_path=engine_path, args=args, parent=parent, name=name)
# Stats
self.__frame_percent_complete = 0.0
@@ -37,16 +31,42 @@ class BlenderRenderWorker(BaseRenderWorker):
cmd.append('-b')
cmd.append(self.input_path)
# Start Python expressions - # todo: investigate splitting into separate 'setup' script
cmd.append('--python-expr')
python_exp = 'import bpy; bpy.context.scene.render.use_overwrite = False;'
# Setup Custom Camera
custom_camera = self.args.get('camera', None)
if custom_camera:
python_exp = python_exp + f"bpy.context.scene.camera = bpy.data.objects['{custom_camera}'];"
# Set Render Device (gpu/cpu/any)
blender_engine = self.args.get('engine', 'BLENDER_EEVEE').upper()
if blender_engine == 'CYCLES':
render_device = self.args.get('render_device', 'any').lower()
if render_device not in {'any', 'gpu', 'cpu'}:
raise AttributeError(f"Invalid Cycles render device: {render_device}")
use_gpu = render_device in {'any', 'gpu'}
use_cpu = render_device in {'any', 'cpu'}
python_exp = python_exp + ("exec(\"for device in bpy.context.preferences.addons["
f"'cycles'].preferences.devices: device.use = {use_cpu} if device.type == 'CPU'"
f" else {use_gpu}\")")
# -- insert any other python exp checks / generators here --
# End Python expressions here
cmd.append(python_exp)
# Export format
export_format = self.args.get('export_format', None) or 'JPEG'
main_part, ext = os.path.splitext(self.output_path)
# Remove the extension only if it is not composed entirely of digits
path_without_ext = main_part if not ext[1:].isdigit() else self.output_path
path_without_ext += "_"
cmd.extend(['-E', blender_engine, '-o', path_without_ext, '-F', export_format])
# set frame range
cmd.extend(['-s', self.start_frame, '-e', self.end_frame, '-a'])
@@ -84,22 +104,30 @@ class BlenderRenderWorker(BaseRenderWorker):
elif line.lower().startswith('error'):
self.log_error(line)
elif 'Saved' in line or 'Saving' in line or 'quit' in line:
render_stats_match = re.match(r'Time: (.*) \(Saving', line)
output_filename_match = re.match(r"Saved: .*_(\d+)\.\w+", line)  # try to get frame # from filename
if output_filename_match:
output_file_number = output_filename_match.groups()[0]
try:
self.current_frame = int(output_file_number)
self._send_frame_complete_notification()
except ValueError:
pass
elif render_stats_match:
time_completed = render_stats_match.groups()[0]
frame_count = self.current_frame - self.end_frame + self.total_frames
logger.info(f'Frame #{self.current_frame} - '
f'{frame_count} of {self.total_frames} completed in {time_completed} | '
f'Total Elapsed Time: {datetime.now() - self.start_time}')
else:
logger.debug(line)
else:
pass
# if len(line.strip()):
# logger.debug(line.strip())
def percent_complete(self):
if self.status == RenderStatus.COMPLETED:
return 1
elif self.total_frames <= 1:
return self.__frame_percent_complete
else:
whole_frame_percent = (self.current_frame - self.start_frame) / self.total_frames
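The `Saved:` regex above recovers the just-finished frame number from Blender's per-frame log line. A standalone check of that extraction (the log line below is illustrative, not captured output):

```python
import re

# Sample (illustrative) Blender stdout line; the worker feeds each line
# through this same pattern to pull the frame number out of the filename.
line = "Saved: '/tmp/render/shot_0042.png'"
output_filename_match = re.match(r"Saved: .*_(\d+)\.\w+", line)
if output_filename_match:
    print(int(output_filename_match.groups()[0]))  # 42
```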
@@ -0,0 +1,17 @@
import bpy
import json
# Ensure Cycles is available
bpy.context.preferences.addons['cycles'].preferences.get_devices()
# Collect the devices information
devices_info = []
for device in bpy.context.preferences.addons['cycles'].preferences.devices:
devices_info.append({
"name": device.name,
"type": device.type,
"use": device.use
})
# Print the devices information in JSON format
print("GPU DATA:" + json.dumps(devices_info))
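On the other side of this handoff, `get_render_devices` extracts the JSON payload from the script's stdout with `re.search(r"GPU DATA:(\[[\s\S]*\])", ...)`. A standalone round-trip of that exchange (the device list is illustrative):

```python
import json
import re

# Simulated script output (device data is illustrative); the engine parses
# the "GPU DATA:" payload back out with this same regex.
output = "Blender quit\nGPU DATA:" + json.dumps(
    [{"name": "CPU 0", "type": "CPU", "use": True}])

match = re.search(r"GPU DATA:(\[[\s\S]*\])", output)
devices = json.loads(match.group(1)) if match else []
print(devices)
```

Using `[\s\S]*` rather than `.*` lets the capture span newlines without needing `re.DOTALL`.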
@@ -1,9 +1,7 @@
import logging
import os
import shutil
import tempfile
import requests
from tqdm import tqdm
@@ -12,26 +10,150 @@ logger = logging.getLogger()
class EngineDownloader:
"""A class responsible for downloading and extracting rendering engines from publicly available URLs.
Attributes:
supported_formats (list[str]): A list of file formats supported by the downloader.
"""
supported_formats = ['.zip', '.tar.xz', '.dmg']
def __init__(self):
pass
# --------------------------------------------
# Required Overrides for Subclasses:
# --------------------------------------------
@classmethod
def find_most_recent_version(cls, system_os=None, cpu=None, lts_only=False):
"""
Finds the most recent version of the rendering engine available for download.
This method should be overridden in a subclass to implement the logic for determining
the most recent version of the rendering engine, optionally filtering by long-term
support (LTS) versions, the operating system, and CPU architecture.
Args:
system_os (str, optional): Desired OS ('linux', 'macos', 'windows'). Defaults to system os.
cpu (str, optional): The CPU architecture for which to download the engine. Default is system cpu.
lts_only (bool, optional): Limit the search to LTS (long-term support) versions only. Default is False.
Returns:
dict: A dict with the following keys:
- 'cpu' (str): The CPU architecture.
- 'system_os' (str): The operating system.
- 'file' (str): The filename of the version's download file.
- 'url' (str): The remote URL for downloading the version.
- 'version' (str): The version number.
Raises:
NotImplementedError: If the method is not overridden in a subclass.
"""
raise NotImplementedError(f"find_most_recent_version not implemented for {cls.__name__}")
@classmethod
def version_is_available_to_download(cls, version, system_os=None, cpu=None):
"""Checks if a requested version of the rendering engine is available for download.
This method should be overridden in a subclass to implement the logic for determining
whether a given version of the rendering engine is available for download, based on the
operating system and CPU architecture.
Args:
version (str): The requested renderer version to download.
system_os (str, optional): Desired OS ('linux', 'macos', 'windows'). Defaults to system os.
cpu (str, optional): The CPU architecture for which to download the engine. Default is system cpu.
Returns:
bool: True if the version is available for download, False otherwise.
Raises:
NotImplementedError: If the method is not overridden in a subclass.
"""
raise NotImplementedError(f"version_is_available_to_download not implemented for {cls.__name__}")
@classmethod
def download_engine(cls, version, download_location, system_os=None, cpu=None, timeout=120):
"""Downloads the requested version of the rendering engine to the given download location.
This method should be overridden in a subclass to implement the logic for downloading
a specific version of the rendering engine. The method is intended to handle the
downloading process based on the version, operating system, CPU architecture, and
timeout parameters.
Args:
version (str): The requested renderer version to download.
download_location (str): The directory where the engine should be downloaded.
system_os (str, optional): Desired OS ('linux', 'macos', 'windows'). Defaults to system os.
cpu (str, optional): The CPU architecture for which to download the engine. Default is system cpu.
timeout (int, optional): The maximum time in seconds to wait for the download. Default is 120 seconds.
Raises:
NotImplementedError: If the method is not overridden in a subclass.
"""
raise NotImplementedError(f"download_engine not implemented for {cls.__name__}")
# --------------------------------------------
# Optional Overrides for Subclasses:
# --------------------------------------------
@classmethod
def all_versions(cls, system_os=None, cpu=None):
"""Retrieves a list of available versions of the software for a specific operating system and CPU architecture.
This method fetches all available versions for the given operating system and CPU type, constructing
a list of dictionaries containing details such as the version, CPU architecture, system OS, and the
remote URL for downloading each version.
Args:
system_os (str, optional): Desired OS ('linux', 'macos', 'windows'). Defaults to system os.
cpu (str, optional): The CPU architecture for which to download the engine. Default is system cpu.
Returns:
list[dict]: A list of dictionaries, each containing:
- 'cpu' (str): The CPU architecture.
- 'file' (str): The filename of the version's download file.
- 'system_os' (str): The operating system.
- 'url' (str): The remote URL for downloading the version.
- 'version' (str): The version number.
"""
return []
# --------------------------------------------
# Do Not Override These Methods:
# --------------------------------------------
@classmethod
def download_and_extract_app(cls, remote_url, download_location, timeout=120):
"""Downloads an application from the given remote URL and extracts it to the specified location.
This method handles the downloading of the application, supports multiple archive formats,
and extracts the contents to the specified `download_location`. It also manages temporary
files and logs progress throughout the process.
Args:
remote_url (str): The URL of the application to download.
download_location (str): The directory where the application should be extracted.
timeout (int, optional): The maximum time in seconds to wait for the download. Default is 120 seconds.
Returns:
str: The path to the directory where the application was extracted.
Raises:
Exception: Catches and logs any exceptions that occur during the download or extraction process.
Supported Formats:
- `.tar.xz`: Extracted using the `tarfile` module.
- `.zip`: Extracted using the `zipfile` module.
- `.dmg`: macOS disk image files, handled using the `dmglib` library.
- Other formats will result in an error being logged.
Notes:
- If the application already exists in the `download_location`, the method will log an error
and return without downloading or extracting.
- Temporary files created during the download process are cleaned up after completion.
"""
# Create a temp download directory
temp_download_dir = tempfile.mkdtemp()
@@ -80,6 +202,7 @@ class EngineDownloader:
# Extract the downloaded file
# Process .tar.xz files
if temp_downloaded_file_path.lower().endswith('.tar.xz'):
import tarfile
try:
with tarfile.open(temp_downloaded_file_path, 'r:xz') as tar:
tar.extractall(path=download_location)
@@ -93,12 +216,13 @@ class EngineDownloader:
# Process .zip files
elif temp_downloaded_file_path.lower().endswith('.zip'):
import zipfile
try:
with zipfile.ZipFile(temp_downloaded_file_path, 'r') as zip_ref:
zip_ref.extractall(download_location)
logger.info(
f'Successfully extracted {os.path.basename(temp_downloaded_file_path)} to {download_location}')
except zipfile.BadZipFile:
logger.error(f'Error: {temp_downloaded_file_path} is not a valid ZIP file.')
except FileNotFoundError:
logger.error(f'File not found: {temp_downloaded_file_path}')
@@ -110,7 +234,8 @@ class EngineDownloader:
for mount_point in dmg.attach():
try:
copy_directory_contents(mount_point, os.path.join(download_location, output_dir_name))
logger.info(f'Successfully copied {os.path.basename(temp_downloaded_file_path)} '
f'to {download_location}')
except FileNotFoundError:
logger.error('Error: The source .app bundle does not exist.')
except PermissionError:
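The `.tar.xz` / `.zip` / `.dmg` branches above amount to dispatch on the file suffix. A stdlib-only sketch of that dispatch (`.dmg` omitted, since `dmglib` is macOS-specific; the function name is illustrative):

```python
import os
import tarfile
import zipfile

def extract_archive(archive_path, destination):
    """Extract a downloaded archive to `destination` based on its extension."""
    lowered = archive_path.lower()
    if lowered.endswith('.tar.xz'):
        with tarfile.open(archive_path, 'r:xz') as tar:
            tar.extractall(path=destination)
    elif lowered.endswith('.zip'):
        with zipfile.ZipFile(archive_path, 'r') as zip_ref:
            zip_ref.extractall(destination)
    else:
        raise ValueError(f'Unsupported archive format: {archive_path}')
    return destination
```

Matching on `.tar.xz` with `endswith` (rather than `os.path.splitext`, which would only see `.xz`) mirrors the checks in the method above.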
@@ -1,5 +1,6 @@
import logging
import os
import platform
import subprocess
logger = logging.getLogger()
@@ -7,14 +8,138 @@ SUBPROCESS_TIMEOUT = 5
class BaseRenderEngine(object):
"""Base class for render engines. This class provides common functionality and structure for various rendering
engines. Create subclasses and override the methods marked below to add additional renderers
Attributes:
install_paths (list): A list of default installation paths where the render engine
might be found. This list can be populated with common paths to help locate the
executable on different operating systems or environments.
"""
install_paths = []
supported_extensions = []
# --------------------------------------------
# Required Overrides for Subclasses:
# --------------------------------------------
def __init__(self, custom_path=None):
self.custom_renderer_path = custom_path
if not self.renderer_path() or not os.path.exists(self.renderer_path()):
raise FileNotFoundError(f"Cannot find path to renderer for {self.name()} instance: {self.renderer_path()}")
if not os.access(self.renderer_path(), os.X_OK):
logger.warning(f"Path is not executable. Setting permissions to 755 for {self.renderer_path()}")
os.chmod(self.renderer_path(), 0o755)
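The constructor's repair step, checking `os.access(..., os.X_OK)` and granting `0o755` when the binary is not executable, can be isolated into a small helper (the function name is illustrative; behavior of the execute bit is POSIX-oriented):

```python
import os
import tempfile

def ensure_executable(path):
    """Make sure a renderer binary is executable, granting 0o755 if it is not."""
    if not os.access(path, os.X_OK):
        # rwxr-xr-x: owner may write, everyone may execute
        os.chmod(path, 0o755)
    return os.access(path, os.X_OK)
```

This matters for engines unpacked from archives, where `zipfile.extractall` does not preserve execute permissions.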
def version(self):
"""Return the version number as a string.
Returns:
str: Version number.
Raises:
NotImplementedError: If not overridden.
"""
raise NotImplementedError(f"version not implemented for {self.__class__.__name__}")
def get_project_info(self, project_path, timeout=10):
"""Extracts detailed project information from the given project path.
Args:
project_path (str): The path to the project file.
timeout (int, optional): The maximum time (in seconds) to wait for the operation. Default is 10 seconds.
Returns:
dict: A dictionary containing project information (subclasses should define the structure).
Raises:
NotImplementedError: If the method is not overridden in a subclass.
"""
raise NotImplementedError(f"get_project_info not implemented for {self.__class__.__name__}")
@classmethod
def get_output_formats(cls):
"""Returns a list of available output formats supported by the renderer.
Returns:
list[str]: A list of strings representing the available output formats.
"""
raise NotImplementedError(f"get_output_formats not implemented for {cls.__name__}")
@staticmethod
def worker_class(): # override when subclassing to link worker class
raise NotImplementedError("Worker class not implemented")
# --------------------------------------------
# Optional Overrides for Subclasses:
# --------------------------------------------
def supported_extensions(self):
"""
Returns:
list[str]: list of supported extensions
"""
return []
def get_help(self):
"""Retrieves the help documentation for the renderer.
This method runs the renderer's help command (default: '-h') and captures the output.
Override this method if the renderer uses a different help flag.
Returns:
str: The help documentation as a string.
Raises:
FileNotFoundError: If the renderer path is not found.
"""
path = self.renderer_path()
if not path:
raise FileNotFoundError("renderer path not found")
creationflags = subprocess.CREATE_NO_WINDOW if platform.system() == 'Windows' else 0
help_doc = subprocess.check_output([path, '-h'], stderr=subprocess.STDOUT,
timeout=SUBPROCESS_TIMEOUT, creationflags=creationflags).decode('utf-8')
return help_doc
def system_info(self):
"""Return additional information about the system specific to the engine (configured GPUs, render engines, etc.)
Returns:
dict: A dictionary with engine-specific system information
"""
return {}
def perform_presubmission_tasks(self, project_path):
"""Perform any pre-submission tasks on a project file before uploading it to a server (pack textures, etc.)
Override this method to:
1. Copy the project file to a temporary location (DO NOT MODIFY ORIGINAL PATH).
2. Perform additional modifications or tasks.
3. Return the path to the modified project file.
Args:
project_path (str): The original project file path.
Returns:
str: The path to the modified project file.
"""
return project_path
def get_arguments(self):
pass
@staticmethod
def downloader(): # override when subclassing if using a downloader class
return None
@staticmethod
def ui_options(system_info): # override to return options for ui
return {}
# --------------------------------------------
# Do Not Override These Methods:
# --------------------------------------------
def renderer_path(self):
return self.custom_renderer_path or self.default_renderer_path()
@@ -35,40 +160,3 @@ class BaseRenderEngine(object):
except Exception as e:
logger.exception(e)
return path
def version(self):
raise NotImplementedError("version not implemented")
@staticmethod
def downloader(): # override when subclassing if using a downloader class
return None
@staticmethod
def worker_class(): # override when subclassing to link worker class
raise NotImplementedError("Worker class not implemented")
def get_help(self): # override if renderer uses different help flag
path = self.renderer_path()
if not path:
raise FileNotFoundError("renderer path not found")
help_doc = subprocess.check_output([path, '-h'], stderr=subprocess.STDOUT,
timeout=SUBPROCESS_TIMEOUT).decode('utf-8')
return help_doc
def get_project_info(self, project_path, timeout=10):
raise NotImplementedError(f"get_project_info not implemented for {self.__name__}")
@classmethod
def get_output_formats(cls):
raise NotImplementedError(f"get_output_formats not implemented for {cls.__name__}")
@classmethod
def get_arguments(cls):
pass
def get_options(self): # override to return options for ui
return {}
def perform_presubmission_tasks(self, project_path):
return project_path
@@ -3,14 +3,17 @@ import io
import json
import logging
import os
import signal
import subprocess
import threading
import time
from datetime import datetime
import psutil
from pubsub import pub
from sqlalchemy import Column, Integer, String, DateTime, JSON
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.ext.mutable import MutableDict
from src.utilities.misc_helper import get_time_elapsed
from src.utilities.status_utils import RenderStatus, string_to_status
@@ -23,6 +26,7 @@ class BaseRenderWorker(Base):
__tablename__ = 'render_workers'
id = Column(String, primary_key=True)
hostname = Column(String, nullable=True)
input_path = Column(String)
output_path = Column(String)
date_created = Column(DateTime)
@@ -36,13 +40,18 @@ class BaseRenderWorker(Base):
start_frame = Column(Integer)
end_frame = Column(Integer, nullable=True)
parent = Column(String, nullable=True)
children = Column(MutableDict.as_mutable(JSON))
args = Column(MutableDict.as_mutable(JSON))
name = Column(String)
file_hash = Column(String)
_status = Column(String)
engine = None
# --------------------------------------------
# Required Overrides for Subclasses:
# --------------------------------------------
def __init__(self, input_path, output_path, engine_path, priority=2, args=None, ignore_extensions=True, parent=None,
name=None):
@@ -52,7 +61,7 @@ class BaseRenderWorker(Base):
logger.error(err_meg)
raise ValueError(err_meg)
if not self.engine:
raise NotImplementedError(f"Engine not defined for {self.__class__.__name__}")
def generate_id():
import uuid
@@ -60,6 +69,7 @@ class BaseRenderWorker(Base):
# Essential Info
self.id = generate_id()
self.hostname = None
self.input_path = input_path
self.output_path = output_path
self.args = args or {}
@@ -72,11 +82,12 @@ class BaseRenderWorker(Base):
self.parent = parent
self.children = {}
self.name = name or os.path.basename(input_path)
self.maximum_attempts = 3
# Frame Ranges
self.project_length = 0  # is this necessary?
self.current_frame = 0
self.start_frame = 0
self.end_frame = None
# Logging
@@ -89,10 +100,59 @@ class BaseRenderWorker(Base):
self.errors = []
# Threads and processes
self.__thread = threading.Thread(target=self.__run, args=())
self.__thread.daemon = True
self.__process = None
self.last_output = None
self.__last_output_time = None
self.watchdog_timeout = 120
def generate_worker_subprocess(self):
"""Generate and return a list of the command line arguments necessary to perform the requested job.
Returns:
list[str]: list of command line arguments
"""
raise NotImplementedError("generate_worker_subprocess not implemented")
def _parse_stdout(self, line):
"""Parses a line of standard output from the renderer.
This method should be overridden in a subclass to implement the logic for processing
and interpreting a single line of output from the renderer's standard output stream.
On frame completion, the subclass should:
1. Update value of self.current_frame
2. Call self._send_frame_complete_notification()
Args:
line (str): A line of text from the renderer's standard output.
Raises:
NotImplementedError: If the method is not overridden in a subclass.
"""
raise NotImplementedError(f"_parse_stdout not implemented for {self.__class__.__name__}")
# --------------------------------------------
# Optional Overrides for Subclasses:
# --------------------------------------------
def percent_complete(self):
# todo: fix this
if self.status == RenderStatus.COMPLETED:
return 1.0
return 0
def post_processing(self):
"""Override to perform any engine-specific postprocessing"""
pass
# --------------------------------------------
# Do Not Override These Methods:
# --------------------------------------------
def __repr__(self):
return f"<Job id:{self.id} p{self.priority} {self.renderer}-{self.renderer_version} '{self.name}' status:{self.status.value}>"
@property
def total_frames(self):
@@ -116,33 +176,27 @@ class BaseRenderWorker(Base):
self._status = RenderStatus.CANCELLED.value
return string_to_status(self._status)
def _send_frame_complete_notification(self):
pub.sendMessage('frame_complete', job_id=self.id, frame_number=self.current_frame)
raise FileNotFoundError(f"Cannot find input path: {self.input_path}")
self.generate_subprocess()
def generate_subprocess(self):
# Convert raw args from string if available and catch conflicts
generated_args = [str(x) for x in self.generate_worker_subprocess()]
generated_args_flags = [x for x in generated_args if x.startswith('-')]
if len(generated_args_flags) != len(set(generated_args_flags)):
msg = f"Cannot generate subprocess - Multiple arg conflicts detected: {generated_args}"
logger.error(msg)
logger.debug(f"Generated args for subprocess: {generated_args}")
raise ValueError(msg)
return generated_args
def get_raw_args(self):
raw_args_string = self.args.get('raw', '')
raw_args = None
if raw_args_string:
import shlex
raw_args = shlex.split(raw_args_string)
return raw_args
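`get_raw_args` splits a user-supplied string with `shlex`, and `generate_subprocess` rejects duplicate dash-prefixed flags. The two checks in isolation (helper names are illustrative):

```python
import shlex

def split_raw_args(raw):
    """Split a raw argument string the way a POSIX shell would."""
    return shlex.split(raw) if raw else []

def find_flag_conflicts(args):
    """Return dash-prefixed flags that appear more than once."""
    flags = [a for a in args if a.startswith('-')]
    return sorted({f for f in flags if flags.count(f) > 1})
```

Using `shlex.split` rather than `str.split` keeps quoted paths with spaces intact, which matters for output paths passed through the raw-args field.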
def generate_worker_subprocess(self):
raise NotImplementedError("generate_worker_subprocess not implemented")
def log_path(self):
filename = (self.name or os.path.basename(self.input_path)) + '_' + \
self.date_created.strftime("%Y.%m.%d_%H.%M.%S") + '.log'
@@ -150,105 +204,236 @@ class BaseRenderWorker(Base):
def start(self):
if self.status not in [RenderStatus.SCHEDULED, RenderStatus.NOT_STARTED, RenderStatus.CONFIGURING]:
logger.error(f"Trying to start job with status: {self.status}")
return
if not os.path.exists(self.input_path):
self.status = RenderStatus.ERROR
msg = f'Cannot find input path: {self.input_path}'
logger.error(msg)
self.errors.append(msg)
return
if not os.path.exists(self.renderer_path):
self.status = RenderStatus.ERROR
msg = f'Cannot find render engine path for {self.engine.name()}'
logger.error(msg)
self.errors.append(msg)
return
self.status = RenderStatus.RUNNING if not self.children else RenderStatus.WAITING_FOR_SUBJOBS
self.start_time = datetime.now()
logger.info(f'Starting {self.engine.name()} {self.renderer_version} Render for {self.input_path} | '
f'Frame Count: {self.total_frames}')
self.__thread.start()
# handle multiple attempts at running subprocess
def __run__subprocess_cycle(self, log_file):
subprocess_cmds = self.generate_subprocess()
initial_file_count = len(self.file_list())
failed_attempts = 0
log_file.write(f"Running command: {subprocess_cmds}\n")
log_file.write('=' * 80 + '\n\n')
while True:
# Log attempt #
if failed_attempts:
if failed_attempts >= self.maximum_attempts:
err_msg = f"Maximum attempts exceeded ({self.maximum_attempts})"
logger.error(err_msg)
self.status = RenderStatus.ERROR
self.errors.append(err_msg)
return
else:
log_file.write(f'\n{"=" * 20} Attempt #{failed_attempts + 1} {"=" * 20}\n\n')
logger.warning(f"Restarting render - Attempt #{failed_attempts + 1}")
self.status = RenderStatus.RUNNING
return_code = self.__setup_and_run_process(log_file, subprocess_cmds)
message = f"{'=' * 50}\n\n{self.engine.name()} render ended with code {return_code} " \
f"after {self.time_elapsed()}\n\n"
log_file.write(message)
# don't try again if we've been cancelled
if self.status in [RenderStatus.CANCELLED, RenderStatus.ERROR]:
return
# if file output hasn't increased, return as error, otherwise restart process.
file_count_has_increased = len(self.file_list()) > initial_file_count
if (self.status == RenderStatus.RUNNING) and file_count_has_increased and not return_code:
break
if return_code:
err_msg = f"{self.engine.name()} render failed with code {return_code}"
logger.error(err_msg)
self.errors.append(err_msg)
# handle instances where renderer exits ok but doesnt generate files
if not return_code and not file_count_has_increased:
err_msg = (f"{self.engine.name()} render exited ok, but file count has not increased. "
f"Count is still {len(self.file_list())}")
log_file.write(f'Error: {err_msg}\n\n')
self.errors.append(err_msg)
# only count the attempt as failed if renderer creates no output - reset counter on successful output
failed_attempts = 0 if file_count_has_increased else failed_attempts + 1
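The retry cycle above treats an attempt as failed only when no new output files appear, capped at `maximum_attempts`. A simplified, dependency-free sketch of that policy (the real loop also resets the counter whenever output grows; names here are illustrative):

```python
def run_with_retries(run_once, output_count, maximum_attempts=3):
    """Retry a render callable until it exits 0 AND produces new output files.

    run_once() -> int return code; output_count() -> number of output files.
    Returns True on success, False once attempts are exhausted.
    """
    initial = output_count()
    attempts = 0
    while attempts < maximum_attempts:
        code = run_once()
        # Success requires both a clean exit and an increased file count,
        # catching renderers that exit 0 without writing anything.
        if code == 0 and output_count() > initial:
            return True
        attempts += 1
    return False
```

The file-count check is what distinguishes this loop from a plain return-code retry: some engines exit cleanly after rendering nothing.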
def __run__wait_for_subjobs(self, logfile):
from src.distributed_job_manager import DistributedJobManager
DistributedJobManager.wait_for_subjobs(parent_job=self)
@staticmethod
def log_and_print(message, log_file, level='info'):
if level == 'debug':
logger.debug(message)
elif level == 'error':
logger.error(message)
else:
logger.info(message)
log_file.write(f"{message}\n")
def __run(self):
# Setup logging
log_dir = os.path.dirname(self.log_path())
os.makedirs(log_dir, exist_ok=True)
with open(self.log_path(), "a") as log_file:
initial_file_count = len(self.file_list())
attempt_number = 0
self.log_and_print(f"{self.start_time.isoformat()} - Starting "
f"{self.engine.name()} {self.renderer_version} render job for {self.name} "
f"({self.input_path})", log_file)
log_file.write(f"\n")
if not self.children:
self.__run__subprocess_cycle(log_file)
else:
self.__run__wait_for_subjobs(log_file)
# Validate Output - End if missing frames
if self.status == RenderStatus.RUNNING:
file_list_length = len(self.file_list())
expected_list_length = (self.end_frame - self.start_frame + 1) if self.end_frame else 1
msg = f"Frames: Expected ({expected_list_length}) vs actual ({file_list_length}) for {self}"
self.log_and_print(msg, log_file, 'debug')
if attempt_number:
f.write(f'\n{"=" * 80} Attempt #{attempt_number} {"=" * 30}\n\n')
logger.warning(f"Restarting render - Attempt #{attempt_number}")
attempt_number += 1
if file_list_length not in (expected_list_length, 1):
msg = f"Missing frames: Expected ({expected_list_length}) vs actual ({file_list_length})"
self.log_and_print(msg, log_file, 'error')
self.errors.append(msg)
self.status = RenderStatus.ERROR
# todo: create new subjob to generate missing frames
# cleanup and close if cancelled / error
if self.status in [RenderStatus.CANCELLED, RenderStatus.ERROR]:
self.end_time = datetime.now()
message = f"{self.engine.name()} render ended with status '{self.status.value}' " \
f"after {self.time_elapsed()}"
self.log_and_print(message, log_file)
log_file.close()
return
# Post Render Work
if not self.parent:
logger.debug(f"Starting post-processing work for {self}")
self.log_and_print(f"Starting post-processing work for {self}", log_file, 'debug')
self.post_processing()
self.log_and_print(f"Completed post-processing work for {self}", log_file, 'debug')
self.status = RenderStatus.COMPLETED
self.end_time = datetime.now()
message = f"Render {self.name} completed successfully after {self.time_elapsed()}"
self.log_and_print(message, log_file)
def __setup_and_run_process(self, f, subprocess_cmds):
def watchdog():
logger.debug(f'Starting process watchdog for {self} with {self.watchdog_timeout}s timeout')
while self.__process.poll() is None:
time_since_last_update = time.time() - self.__last_output_time
if time_since_last_update > self.watchdog_timeout:
logger.error(f"Process for {self} terminated due to exceeding timeout ({self.watchdog_timeout}s)")
self.__kill_process()
break
# logger.debug(f'Watchdog for {self} - Time since last update: {time_since_last_update}')
time.sleep(1)
logger.debug(f'Stopping process watchdog for {self}')
return_code = -1
watchdog_thread = threading.Thread(target=watchdog)
watchdog_thread.daemon = True
try:
# Start process and get updates
if os.name == 'posix':  # linux / mac
self.__process = subprocess.Popen(subprocess_cmds, stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
universal_newlines=False, preexec_fn=os.setsid)
else: # windows
creationflags = subprocess.CREATE_NEW_PROCESS_GROUP | subprocess.CREATE_NO_WINDOW
self.__process = subprocess.Popen(subprocess_cmds, stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
universal_newlines=False,
creationflags=creationflags)
# Start watchdog
self.__last_output_time = time.time()
watchdog_thread.start()
for c in io.TextIOWrapper(self.__process.stdout, encoding="utf-8"):  # or another encoding
f.write(c)
self.last_output = c.strip()
self.__last_output_time = time.time()
try:
f.write(c)
f.flush()
os.fsync(f.fileno())
except Exception as e:
logger.error(f"Error saving log to disk: {e}")
try:
self._parse_stdout(c.strip())
except Exception as e:
logger.error(f'Error parsing stdout: {e}')
f.write('\n')
# Check return codes and process
return_code = self.__process.wait()
self.end_time = datetime.now() except Exception as e:
message = f'Uncaught error running render process: {e}'
if self.status in [RenderStatus.CANCELLED, RenderStatus.ERROR]: # user cancelled
message = f"{self.engine.name()} render ended with status '{self.status}' " \
f"after {self.time_elapsed()}"
f.write(message)
logger.exception(message)
self.__kill_process()
# let watchdog end before continuing - prevents multiple watchdogs running when process restarts
if watchdog_thread.is_alive():
watchdog_thread.join()
return return_code
def __kill_process(self):
try:
if self.__process.poll():
return return
logger.debug(f"Trying to kill process {self.__process}")
if not return_code: self.__process.terminate()
message = f"{'=' * 50}\n\n{self.engine.name()} render completed successfully in {self.time_elapsed()}" self.__process.kill()
f.write(message) if os.name == 'posix': # linux / macos
break os.killpg(os.getpgid(self.__process.pid), signal.SIGTERM)
os.killpg(os.getpgid(self.__process.pid), signal.SIGKILL)
# Handle non-zero return codes else: # windows
message = f"{'=' * 50}\n\n{self.engine.name()} render failed with code {return_code} " \ parent = psutil.Process(self.__process.pid)
f"after {self.time_elapsed()}" for child in parent.children(recursive=True):
f.write(message) child.kill()
self.errors.append(message) self.__process.wait(timeout=5)
logger.debug(f"Process ended with status {self.__process.poll()}")
# if file output hasn't increased, return as error, otherwise restart process. except (ProcessLookupError, AttributeError, psutil.NoSuchProcess):
if len(self.file_list()) <= initial_file_count:
self.status = RenderStatus.ERROR
return
if self.children:
from src.distributed_job_manager import DistributedJobManager
DistributedJobManager.wait_for_subjobs(local_job=self)
# Post Render Work
logger.debug("Starting post-processing work")
self.post_processing()
self.status = RenderStatus.COMPLETED
logger.info(f"Render {self.id}-{self.name} completed successfully after {self.time_elapsed()}")
def post_processing(self):
pass pass
except Exception as e:
logger.error(f"Error stopping the process: {e}")
def is_running(self): def is_running(self):
if self.__thread: if hasattr(self, '__thread'):
return self.__thread.is_alive() return self.__thread.is_alive()
return False return False
@@ -259,15 +444,11 @@ class BaseRenderWorker(Base):
self.stop(is_error=True)
def stop(self, is_error=False):
logger.debug(f"Stopping {self}")
# cleanup status
if self.status in [RenderStatus.RUNNING, RenderStatus.NOT_STARTED, RenderStatus.SCHEDULED,
RenderStatus.CONFIGURING]:
if is_error:
err_message = self.errors[-1] if self.errors else 'Unknown error'
logger.error(f"Halting render due to error: {err_message}")
@@ -275,11 +456,9 @@ class BaseRenderWorker(Base):
else:
self.status = RenderStatus.CANCELLED
self.__kill_process()
if self.is_running():  # allow the log files to close
self.__thread.join(timeout=5)
def time_elapsed(self):
return get_time_elapsed(self.start_time, self.end_time)
@@ -287,7 +466,11 @@ class BaseRenderWorker(Base):
def file_list(self):
try:
job_dir = os.path.dirname(self.output_path)
file_list = [
os.path.join(job_dir, file)
for file in os.listdir(job_dir)
if not file.startswith('.')  # Ignore hidden files
]
file_list.sort()
return file_list
except FileNotFoundError:
@@ -297,6 +480,7 @@ class BaseRenderWorker(Base):
job_dict = {
'id': self.id,
'name': self.name,
'hostname': self.hostname,
'input_path': self.input_path,
'output_path': self.output_path,
'priority': self.priority,
@@ -316,7 +500,8 @@ class BaseRenderWorker(Base):
'end_frame': self.end_frame,
'total_frames': self.total_frames,
'last_output': getattr(self, 'last_output', None),
'log_path': self.log_path(),
'args': self.args
}
# convert to json and back to auto-convert dates to iso format
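The stdout loop and stall watchdog in the worker above can be sketched in isolation. This is a minimal stand-alone version, assuming a generic command list; the helper name `run_with_watchdog` and the stall threshold are illustrative, not part of this codebase:

```python
import io
import subprocess
import sys
import threading
import time


def run_with_watchdog(cmd, stall_timeout=5.0):
    """Run cmd, streaming stdout line by line while a daemon watchdog
    counts stretches with no output (a stalled process)."""
    last_output = {'time': time.time()}
    stalls = []

    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

    def watchdog():
        # Poll until the process exits, noting any stall longer than the timeout
        while proc.poll() is None:
            if time.time() - last_output['time'] > stall_timeout:
                stalls.append(time.time())  # a real worker might kill/restart here
            time.sleep(0.1)

    threading.Thread(target=watchdog, daemon=True).start()

    lines = []
    for line in io.TextIOWrapper(proc.stdout, encoding='utf-8'):
        last_output['time'] = time.time()  # reset the stall clock on every line
        lines.append(line.strip())
    return proc.wait(), lines


if __name__ == '__main__':
    rc, lines = run_with_watchdog([sys.executable, '-c', "print('frame 1')"])
    print(rc, lines)
```

Keeping the watchdog as a daemon thread and joining it after the process exits mirrors the worker's approach of preventing multiple watchdogs when the process restarts.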
@@ -2,6 +2,7 @@ import logging
import os
import shutil
import threading
import concurrent.futures
from src.engines.blender.blender_engine import Blender
from src.engines.ffmpeg.ffmpeg_engine import FFMPEG
@@ -11,6 +12,9 @@ logger = logging.getLogger()
class EngineManager:
"""Class that manages different versions of installed renderers and handles fetching and downloading new versions,
if possible.
"""
engines_path = None
download_tasks = []
@@ -19,6 +23,10 @@ class EngineManager:
def supported_engines():
return [Blender, FFMPEG]
@classmethod
def downloadable_engines(cls):
return [engine for engine in cls.supported_engines() if hasattr(engine, "downloader") and engine.downloader()]
@classmethod
def engine_with_name(cls, engine_name):
for obj in cls.supported_engines():
@@ -26,78 +34,109 @@ class EngineManager:
return obj
@classmethod
def get_engines(cls, filter_name=None, include_corrupt=False, ignore_system=False):
if not cls.engines_path:
raise FileNotFoundError("Engine path is not set")
# Parse downloaded engine directory
results = []
try:
all_items = os.listdir(cls.engines_path)
all_directories = [item for item in all_items if os.path.isdir(os.path.join(cls.engines_path, item))]
for directory in all_directories:
# Split directory name into segments
segments = directory.split('-')
# Create a dictionary with named keys
keys = ["engine", "version", "system_os", "cpu"]
result_dict = {keys[i]: segments[i] for i in range(min(len(keys), len(segments)))}
result_dict['type'] = 'managed'
# Initialize binary_name with engine name
binary_name = result_dict['engine'].lower()
# Determine the correct binary name based on the engine and system_os
eng = cls.engine_with_name(result_dict['engine'])
binary_name = eng.binary_names.get(result_dict['system_os'], binary_name)
# Find the path to the binary file
path = next(
(os.path.join(root, binary_name) for root, _, files in
os.walk(system_safe_path(os.path.join(cls.engines_path, directory))) if binary_name in files),
None
)
result_dict['path'] = path
# fetch version number from binary - helps detect corrupted downloads
binary_version = eng(path).version()
if not binary_version:
logger.warning(f"Possible corrupt {eng.name()} {result_dict['version']} install detected: {path}")
if not include_corrupt:
continue
result_dict['version'] = binary_version or 'error'
# Add the result dictionary to results if it matches the filter_name or if no filter is applied
if not filter_name or filter_name == result_dict['engine']:
results.append(result_dict)
except FileNotFoundError as e:
logger.warning(f"Cannot find local engines download directory: {e}")
# add system installs to this list - use bg thread because it can be slow
def fetch_engine_details(eng, include_corrupt=False):
version = eng().version()
if not version and not include_corrupt:
return
return {
'engine': eng.name(),
'version': version or 'error',
'system_os': current_system_os(),
'cpu': current_system_cpu(),
'path': eng.default_renderer_path(),
'type': 'system'
}
if not ignore_system:
with concurrent.futures.ThreadPoolExecutor() as executor:
futures = {
executor.submit(fetch_engine_details, eng, include_corrupt): eng.name()
for eng in cls.supported_engines()
if eng.default_renderer_path() and (not filter_name or filter_name == eng.name())
}
for future in concurrent.futures.as_completed(futures):
result = future.result()
if result:
results.append(result)
return results
@classmethod
def all_versions_for_engine(cls, engine_name, include_corrupt=False, ignore_system=False):
versions = cls.get_engines(filter_name=engine_name, include_corrupt=include_corrupt, ignore_system=ignore_system)
sorted_versions = sorted(versions, key=lambda x: x['version'], reverse=True)
return sorted_versions
@classmethod
def newest_engine_version(cls, engine, system_os=None, cpu=None, ignore_system=None):
system_os = system_os or current_system_os()
cpu = cpu or current_system_cpu()
try:
filtered = [x for x in cls.all_versions_for_engine(engine, ignore_system=ignore_system)
if x['system_os'] == system_os and x['cpu'] == cpu]
return filtered[0]
except IndexError:
logger.error(f"Cannot find newest engine version for {engine}-{system_os}-{cpu}")
return None
@classmethod
def is_version_downloaded(cls, engine, version, system_os=None, cpu=None, ignore_system=False):
system_os = system_os or current_system_os()
cpu = cpu or current_system_cpu()
filtered = [x for x in cls.get_engines(filter_name=engine, ignore_system=ignore_system) if
x['system_os'] == system_os and x['cpu'] == cpu and x['version'] == version]
return filtered[0] if filtered else False
@classmethod
@@ -106,6 +145,7 @@ class EngineManager:
downloader = cls.engine_with_name(engine).downloader()
return downloader.version_is_available_to_download(version=version, system_os=system_os, cpu=cpu)
except Exception as e:
logger.debug(f"Exception in version_is_available_to_download: {e}")
return None
@classmethod
@@ -114,10 +154,11 @@ class EngineManager:
downloader = cls.engine_with_name(engine).downloader()
return downloader.find_most_recent_version(system_os=system_os, cpu=cpu)
except Exception as e:
logger.debug(f"Exception in find_most_recent_version: {e}")
return None
@classmethod
def get_existing_download_task(cls, engine, version, system_os=None, cpu=None):
for task in cls.download_tasks:
task_parts = task.name.split('-')
task_engine, task_version, task_system_os, task_cpu = task_parts[:4]
@@ -125,26 +166,17 @@ class EngineManager:
if engine == task_engine and version == task_version:
if system_os in (task_system_os, None) and cpu in (task_cpu, None):
return task
return None
@classmethod
def download_engine(cls, engine, version, system_os=None, cpu=None, background=False, ignore_system=False):
engine_to_download = cls.engine_with_name(engine)
existing_task = cls.get_existing_download_task(engine, version, system_os, cpu)
if existing_task:
logger.debug(f"Already downloading {engine} {version}")
if not background:
existing_task.join()  # If download task exists, wait until it's done downloading
return
elif not engine_to_download.downloader():
logger.warning("No valid downloader for this engine. Please update this software manually.")
@@ -152,21 +184,19 @@ class EngineManager:
elif not cls.engines_path:
raise FileNotFoundError("Engines path must be set before requesting downloads")
thread = EngineDownloadWorker(engine, version, system_os, cpu)
cls.download_tasks.append(thread)
thread.start()
if background:
return thread
thread.join()
found_engine = cls.is_version_downloaded(engine, version, system_os, cpu, ignore_system)  # Check that engine downloaded
if not found_engine:
logger.error(f"Error downloading {engine}")
return found_engine
@classmethod
def delete_engine_download(cls, engine, version, system_os=None, cpu=None):
logger.info(f"Requested deletion of engine: {engine}-{version}")
@@ -188,25 +218,20 @@ class EngineManager:
return False
@classmethod
def is_engine_update_available(cls, engine_class, ignore_system_installs=False):
logger.debug(f"Checking for updates to {engine_class.name()}")
latest_version = engine_class.downloader().find_most_recent_version()
if not latest_version:
logger.warning(f"Could not find most recent version of {engine_class.name()} to download")
return
version_num = latest_version.get('version')
if cls.is_version_downloaded(engine_class.name(), version_num, ignore_system=ignore_system_installs):
logger.debug(f"Latest version of {engine_class.name()} ({version_num}) already downloaded")
return
return latest_version
@classmethod
@@ -215,13 +240,13 @@ class EngineManager:
worker_class = cls.engine_with_name(renderer).worker_class()
# check to make sure we have versions installed
all_versions = cls.all_versions_for_engine(renderer)
if not all_versions:
raise FileNotFoundError(f"Cannot find any installed {renderer} engines")
# Find the path to the requested engine version or use default
engine_path = None
if engine_version and engine_version != 'latest':
for ver in all_versions:
if ver['version'] == engine_version:
engine_path = ver['path']
@@ -229,11 +254,14 @@ class EngineManager:
# Download the required engine if not found locally
if not engine_path:
download_result = cls.download_engine(renderer, engine_version)
if not download_result:
raise FileNotFoundError(f"Cannot download requested version: {renderer} {engine_version}")
engine_path = download_result['path']
logger.info("Engine downloaded. Creating worker.")
else:
logger.debug(f"Using latest engine version ({all_versions[0]['version']})")
engine_path = all_versions[0]['path']
if not engine_path:
raise FileNotFoundError(f"Cannot find requested engine version {engine_version}")
@@ -243,15 +271,50 @@ class EngineManager:
@classmethod
def engine_for_project_path(cls, path):
_, extension = os.path.splitext(path)
extension = extension.lower().strip('.')
for engine in cls.supported_engines():
if extension in engine().supported_extensions():
return engine
undefined_renderer_support = [x for x in cls.supported_engines() if not x().supported_extensions()]
return undefined_renderer_support[0]
class EngineDownloadWorker(threading.Thread):
"""A thread worker for downloading a specific version of a rendering engine.
This class handles the process of downloading a rendering engine in a separate thread,
ensuring that the download process does not block the main application.
Attributes:
engine (str): The name of the rendering engine to download.
version (str): The version of the rendering engine to download.
system_os (str, optional): The operating system for which to download the engine. Defaults to current OS type.
cpu (str, optional): Requested CPU architecture. Defaults to system CPU type.
"""
def __init__(self, engine, version, system_os=None, cpu=None):
super().__init__()
self.engine = engine
self.version = version
self.system_os = system_os
self.cpu = cpu
def run(self):
existing_download = EngineManager.is_version_downloaded(self.engine, self.version, self.system_os, self.cpu,
ignore_system=True)
if existing_download:
logger.info(f"Requested download of {self.engine} {self.version}, but local copy already exists")
return existing_download
# Get the appropriate downloader class based on the engine type
EngineManager.engine_with_name(self.engine).downloader().download_engine(
self.version, download_location=EngineManager.engines_path, system_os=self.system_os, cpu=self.cpu,
timeout=300)
# remove itself from the downloader list
EngineManager.download_tasks.remove(self)
if __name__ == '__main__':
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
@@ -259,4 +322,4 @@ if __name__ == '__main__':
# EngineManager.delete_engine_download('blender', '3.2.1', 'macos', 'a')
EngineManager.engines_path = "/Users/brettwilliams/zordon-uploads/engines"
# print(EngineManager.is_version_downloaded("ffmpeg", "6.0"))
print(EngineManager.get_engines())
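The `engine-version-os-cpu` directory naming scheme that `get_engines` parses can be exercised on its own. A small sketch of just the name-splitting step; the sample directory names below are hypothetical:

```python
def parse_engine_dir(directory):
    """Split a managed-engine directory name (engine-version-os-cpu)
    into labeled fields, tolerating missing trailing segments."""
    keys = ["engine", "version", "system_os", "cpu"]
    segments = directory.split('-')
    # zip only as many keys as there are segments
    info = {keys[i]: segments[i] for i in range(min(len(keys), len(segments)))}
    info['type'] = 'managed'
    return info
```

For example, `parse_engine_dir('blender-3.6.2-linux-x64')` labels all four segments, while a shorter name like `ffmpeg-6.0` yields only the engine and version keys; note this scheme assumes no dashes inside version strings.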
@@ -90,7 +90,7 @@ class FFMPEGDownloader(EngineDownloader):
return releases
@classmethod
def all_versions(cls, system_os=None, cpu=None):
system_os = system_os or current_system_os()
cpu = cpu or current_system_cpu()
versions_per_os = {'linux': cls.__get_linux_versions, 'macos': cls.__get_macos_versions,
@@ -131,14 +131,14 @@ class FFMPEGDownloader(EngineDownloader):
try:
system_os = system_os or current_system_os()
cpu = cpu or current_system_cpu()
return cls.all_versions(system_os, cpu)[0]
except (IndexError, requests.exceptions.RequestException) as e:
logger.error(f"Cannot get most recent version of ffmpeg: {e}")
return {}
@classmethod
def version_is_available_to_download(cls, version, system_os=None, cpu=None):
for ver in cls.all_versions(system_os, cpu):
if ver['version'] == version:
return ver
return None
@@ -149,7 +149,7 @@ class FFMPEGDownloader(EngineDownloader):
cpu = cpu or current_system_cpu()
# Verify requested version is available
found_version = [item for item in cls.all_versions(system_os, cpu) if item['version'] == version]
if not found_version:
logger.error(f"Cannot find FFMPEG version {version} for {system_os} and {cpu}")
return
@@ -182,4 +182,5 @@ if __name__ == "__main__":
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
# print(FFMPEGDownloader.download_engine('6.0', '/Users/brett/zordon-uploads/engines/'))
# print(FFMPEGDownloader.find_most_recent_version(system_os='linux'))
print(FFMPEGDownloader.download_engine(version='6.0', download_location='/Users/brett/zordon-uploads/engines/',
system_os='linux', cpu='x64'))
@@ -3,9 +3,10 @@ import re
from src.engines.core.base_engine import *
_creationflags = subprocess.CREATE_NO_WINDOW if platform.system() == 'Windows' else 0
class FFMPEG(BaseRenderEngine):
binary_names = {'linux': 'ffmpeg', 'windows': 'ffmpeg.exe', 'macos': 'ffmpeg'}
@staticmethod
@@ -18,11 +19,14 @@ class FFMPEG(BaseRenderEngine):
from src.engines.ffmpeg.ffmpeg_worker import FFMPEGRenderWorker
return FFMPEGRenderWorker
def ui_options(self):
from src.engines.ffmpeg.ffmpeg_ui import FFMPEGUI
return FFMPEGUI.get_options(self)
def supported_extensions(self):
help_text = (subprocess.check_output([self.renderer_path(), '-h', 'full'], stderr=subprocess.STDOUT,
creationflags=_creationflags).decode('utf-8'))
found = re.findall(r'extensions that .* is allowed to access \(default "(.*)"', help_text)
found_extensions = set()
for match in found:
found_extensions.update(match.split(','))
@@ -31,9 +35,9 @@ class FFMPEG(BaseRenderEngine):
def version(self):
version = None
try:
ver_out = subprocess.check_output([self.renderer_path(), '-version'], timeout=SUBPROCESS_TIMEOUT,
creationflags=_creationflags).decode('utf-8')
match = re.match(r".*version\s*([\w.*]+)\W*", ver_out)
if match:
version = match.groups()[0]
except Exception as e:
@@ -47,8 +51,9 @@ class FFMPEG(BaseRenderEngine):
'ffprobe', '-v', 'quiet', '-print_format', 'json',
'-show_streams', '-select_streams', 'v', project_path
]
output = subprocess.check_output(cmd, stderr=subprocess.STDOUT, text=True,
creationflags=_creationflags)
video_info = json.loads(output)
# Extract the necessary information
video_stream = video_info['streams'][0]
@@ -78,8 +83,8 @@ class FFMPEG(BaseRenderEngine):
def get_encoders(self):
raw_stdout = subprocess.check_output([self.renderer_path(), '-encoders'], stderr=subprocess.DEVNULL,
timeout=SUBPROCESS_TIMEOUT, creationflags=_creationflags).decode('utf-8')
pattern = r'(?P<type>[VASFXBD.]{6})\s+(?P<name>\S{2,})\s+(?P<description>.*)'
encoders = [m.groupdict() for m in re.finditer(pattern, raw_stdout)]
return encoders
@@ -90,8 +95,9 @@ class FFMPEG(BaseRenderEngine):
def get_all_formats(self):
try:
formats_raw = subprocess.check_output([self.renderer_path(), '-formats'], stderr=subprocess.DEVNULL,
timeout=SUBPROCESS_TIMEOUT,
creationflags=_creationflags).decode('utf-8')
pattern = r'(?P<type>[DE]{1,2})\s+(?P<id>\S{2,})\s+(?P<name>.*)'
all_formats = [m.groupdict() for m in re.finditer(pattern, formats_raw)]
return all_formats
except Exception as e:
@@ -102,7 +108,8 @@ class FFMPEG(BaseRenderEngine):
# Extract the common extension using regex
muxer_flag = 'muxer' if 'E' in ffmpeg_format['type'] else 'demuxer'
format_detail_raw = subprocess.check_output(
[self.renderer_path(), '-hide_banner', '-h', f"{muxer_flag}={ffmpeg_format['id']}"],
creationflags=_creationflags).decode('utf-8')
pattern = r"Common extensions: (\w+)"
common_extensions = re.findall(pattern, format_detail_raw)
found_extensions = []
@@ -116,15 +123,16 @@ class FFMPEG(BaseRenderEngine):
def get_frame_count(self, path_to_file):
raw_stdout = subprocess.check_output([self.renderer_path(), '-i', path_to_file, '-map', '0:v:0', '-c', 'copy',
'-f', 'null', '-'], stderr=subprocess.STDOUT,
timeout=SUBPROCESS_TIMEOUT, creationflags=_creationflags).decode('utf-8')
match = re.findall(r'frame=\s*(\d+)', raw_stdout)
if match:
frame_number = int(match[-1])
return frame_number
return -1
def get_arguments(self):
help_text = (subprocess.check_output([self.renderer_path(), '-h', 'long'], stderr=subprocess.STDOUT,
creationflags=_creationflags).decode('utf-8'))
lines = help_text.splitlines()
options = {}
+5
@@ -0,0 +1,5 @@
+class FFMPEGUI:
+    @staticmethod
+    def get_options(system_info):
+        options = []
+        return options
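The new `FFMPEGUI.get_options` stub returns an empty list for now. Judging from how the job form consumes these entries later in this diff (a combo box when an `'options'` list is present, a free-text field otherwise), a populated version might look like this — the specific option names and values are hypothetical:

```python
# Hypothetical illustration of the option dicts the job form consumes:
# each entry needs a 'name'; an 'options' list yields a combo box,
# and entries without 'options' fall back to a free-text field.
def get_options(system_info):
    options = [
        {'name': 'hardware_encoder', 'options': ['none', 'h264_videotoolbox']},
        {'name': 'extra_args'},  # no 'options' -> rendered as a text input
    ]
    return options

opts = get_options({})
```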
+2 -3
@@ -1,6 +1,5 @@
 #!/usr/bin/env python3
 import re
-import subprocess

 from src.engines.core.base_worker import BaseRenderWorker
 from src.engines.ffmpeg.ffmpeg_engine import FFMPEG
@@ -17,7 +16,7 @@ class FFMPEGRenderWorker(BaseRenderWorker):
     def generate_worker_subprocess(self):
-        cmd = [self.engine.default_renderer_path(), '-y', '-stats', '-i', self.input_path]
+        cmd = [self.renderer_path, '-y', '-stats', '-i', self.input_path]

         # Resize frame
         if self.args.get('x_resolution', None) and self.args.get('y_resolution', None):
@@ -29,7 +28,7 @@ class FFMPEGRenderWorker(BaseRenderWorker):
             cmd.extend(raw_args.split(' '))

         # Close with output path
-        cmd.append(self.output_path)
+        cmd.extend(['-max_muxing_queue_size', '1024', self.output_path])
         return cmd

     def percent_complete(self):
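The worker change above swaps `self.engine.default_renderer_path()` for `self.renderer_path` and inserts `-max_muxing_queue_size 1024` before the output path. The command assembly can be sketched standalone; the `-vf scale=` resize filter is an assumption, since the body of the resize branch is not shown in the hunk:

```python
def build_ffmpeg_cmd(renderer_path, input_path, output_path,
                     x_res=None, y_res=None, raw_args=''):
    # mirrors generate_worker_subprocess: overwrite output, show stats, read input
    cmd = [renderer_path, '-y', '-stats', '-i', input_path]
    if x_res and y_res:
        cmd.extend(['-vf', f'scale={x_res}:{y_res}'])  # assumed resize filter
    if raw_args:
        cmd.extend(raw_args.split(' '))
    # larger mux queue helps avoid "Too many packets buffered" muxing failures
    cmd.extend(['-max_muxing_queue_size', '1024', output_path])
    return cmd

cmd = build_ffmpeg_cmd('/usr/bin/ffmpeg', 'in.mov', 'out.mp4', 1920, 1080)
print(cmd)
```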
+155 -28
@@ -1,59 +1,161 @@
 ''' app/__init__.py '''
 import logging
+import multiprocessing
 import os
+import socket
 import sys
 import threading
 from collections import deque
+from datetime import datetime

-from PyQt6.QtCore import QObject, pyqtSignal
-from PyQt6.QtWidgets import QApplication
+from PyQt6.QtCore import QSettings

-from .render_queue import RenderQueue
-from .ui.main_window import MainWindow
 from src.api.api_server import start_server
+from src.api.preview_manager import PreviewManager
+from src.api.serverproxy_manager import ServerProxyManager
+from src.distributed_job_manager import DistributedJobManager
+from src.engines.engine_manager import EngineManager
+from src.render_queue import RenderQueue
 from src.utilities.config import Config
-from src.utilities.misc_helper import system_safe_path
+from src.utilities.misc_helper import (system_safe_path, current_system_cpu, current_system_os,
+                                       current_system_os_version, check_for_updates)
+from src.utilities.zeroconf_server import ZeroconfServer
+from version import APP_NAME, APP_VERSION, APP_REPO_NAME, APP_REPO_OWNER, APP_AUTHOR

+logger = logging.getLogger()

-def run() -> int:
-    """
-    Initializes the application and runs it.
+def run(server_only=False) -> int:
+    """Initializes the application and runs it.
+
+    Args:
+        server_only: Run in server-only CLI mode. Default is False (runs in GUI mode).

     Returns:
         int: The exit status code.
     """
-    # Load Config YAML
-    config_dir = os.path.join(os.path.dirname(os.path.dirname(os.path.abspath(__file__))), 'config')
-    Config.load_config(system_safe_path(os.path.join(config_dir, 'config.yaml')))
+    def existing_process(process_name):
+        import psutil
+        current_pid = os.getpid()
+        current_process = psutil.Process(current_pid)
+        for proc in psutil.process_iter(['pid', 'name', 'ppid']):
+            proc_name = proc.info['name'].lower().rstrip('.exe')
+            if proc_name == process_name.lower() and proc.info['pid'] != current_pid:
+                if proc.info['pid'] == current_process.ppid():
+                    continue  # parent process
+                elif proc.info['ppid'] == current_pid:
+                    continue  # child process
+                else:
+                    return proc  # unrelated process
+        return None

+    # setup logging
     logging.basicConfig(format='%(asctime)s: %(levelname)s: %(module)s: %(message)s', datefmt='%d-%b-%y %H:%M:%S',
                         level=Config.server_log_level.upper())
+    logging.getLogger("requests").setLevel(logging.WARNING)  # suppress noisy requests/urllib3 logging
+    logging.getLogger("urllib3").setLevel(logging.WARNING)

-    app: QApplication = QApplication(sys.argv)
-
-    # Start server in background
-    background_server = threading.Thread(target=start_server)
-    background_server.daemon = True
-    background_server.start()
+    # check for existing instance
+    existing_proc = existing_process(APP_NAME)
+    if existing_proc:
+        logger.fatal(f"Another instance of {APP_NAME} is already running (pid: {existing_proc.pid})")
+        sys.exit(1)

     # Setup logging for console ui
-    buffer_handler = BufferingHandler()
-    buffer_handler.setFormatter(logging.getLogger().handlers[0].formatter)
-    logger = logging.getLogger()
-    logger.addHandler(buffer_handler)
+    buffer_handler = __setup_buffer_handler() if not server_only else None

-    window: MainWindow = MainWindow()
-    window.buffer_handler = buffer_handler
-    window.show()
+    # check for updates
+    update_thread = threading.Thread(target=check_for_updates, args=(APP_REPO_NAME, APP_REPO_OWNER, APP_NAME,
+                                                                     APP_VERSION))
+    update_thread.start()

-    return_code = app.exec()
+    settings = QSettings(APP_AUTHOR, APP_NAME)
+
+    # main start
+    logger.info(f"Starting {APP_NAME} Render Server")
+    return_code = 0
+    try:
+        # Load Config YAML
+        Config.setup_config_dir()
+        Config.load_config(system_safe_path(os.path.join(Config.config_dir(), 'config.yaml')))
+
+        # configure default paths
+        EngineManager.engines_path = system_safe_path(
+            os.path.join(os.path.join(os.path.expanduser(Config.upload_folder),
+                                      'engines')))
+        os.makedirs(EngineManager.engines_path, exist_ok=True)
+        PreviewManager.storage_path = system_safe_path(
+            os.path.join(os.path.expanduser(Config.upload_folder), 'previews'))
+
+        # Debug info
+        logger.debug(f"Upload directory: {os.path.expanduser(Config.upload_folder)}")
+        logger.debug(f"Thumbs directory: {PreviewManager.storage_path}")
+        logger.debug(f"Engines directory: {EngineManager.engines_path}")
+
+        # Set up the RenderQueue object
+        RenderQueue.load_state(database_directory=system_safe_path(os.path.expanduser(Config.upload_folder)))
+        ServerProxyManager.subscribe_to_listener()
+        DistributedJobManager.subscribe_to_listener()
+
+        # check for updates for render engines if configured
+        ignore_system = settings.value("engines_ignore_system_installs", False)
+        if settings.value('check_for_engine_updates_on_launch', False):
+            for engine in EngineManager.downloadable_engines():
+                if settings.value(f'engine_download-{engine.name()}', False):
+                    update_result = EngineManager.is_engine_update_available(engine, ignore_system_installs=ignore_system)
+                    EngineManager.download_engine(engine=engine.name(), version=update_result['version'],
+                                                  background=True,
+                                                  ignore_system=ignore_system)
+            settings.setValue("engines_last_update_time", datetime.now().isoformat())
+
+        # get hostname
+        local_hostname = socket.gethostname()
+        local_hostname = local_hostname + (".local" if not local_hostname.endswith(".local") else "")
+
+        # configure and start API server
+        api_server = threading.Thread(target=start_server, args=(local_hostname,))
+        api_server.daemon = True
+        api_server.start()
+
+        # start zeroconf server
+        ZeroconfServer.configure(f"_{APP_NAME.lower()}._tcp.local.", local_hostname, Config.port_number)
+        ZeroconfServer.properties = {'system_cpu': current_system_cpu(),
+                                     'system_cpu_cores': multiprocessing.cpu_count(),
+                                     'system_os': current_system_os(),
+                                     'system_os_version': current_system_os_version()}
+        ZeroconfServer.start()
+        logger.info(f"{APP_NAME} Render Server started - Hostname: {local_hostname}")
+
+        RenderQueue.start()  # Start evaluating the render queue
+
+        # start in gui or server only (cli) mode
+        logger.debug(f"Launching in {'server only' if server_only else 'GUI'} mode")
+        if server_only:  # CLI only
+            api_server.join()
+        else:  # GUI
+            return_code = __show_gui(buffer_handler)
+    except KeyboardInterrupt:
+        pass
+    except Exception as e:
+        logging.error(f"Unhandled exception: {e}")
+        return_code = 1
+    finally:
+        # shut down gracefully
+        logger.info(f"{APP_NAME} Render Server is preparing to shut down")
+        try:
-    RenderQueue.prepare_for_shutdown()
+            RenderQueue.prepare_for_shutdown()
+        except Exception as e:
+            logger.exception(f"Exception during prepare for shutdown: {e}")
+        ZeroconfServer.stop()
+        logger.info(f"{APP_NAME} Render Server has shut down")

     return sys.exit(return_code)

-class BufferingHandler(logging.Handler, QObject):
-    new_record = pyqtSignal(str)
+def __setup_buffer_handler():
+    # lazy load GUI frameworks
+    from PyQt6.QtCore import QObject, pyqtSignal
+
+    class BufferingHandler(logging.Handler, QObject):
+        new_record = pyqtSignal(str)

-    def __init__(self, capacity=100):
+        def __init__(self, capacity=100):
@@ -62,9 +164,34 @@ class BufferingHandler(logging.Handler, QObject):
-        self.buffer = deque(maxlen=capacity)  # Define a buffer with a fixed capacity
+            self.buffer = deque(maxlen=capacity)  # Define a buffer with a fixed capacity

-    def emit(self, record):
-        msg = self.format(record)
-        self.buffer.append(msg)  # Add message to the buffer
-        self.new_record.emit(msg)  # Emit signal
+        def emit(self, record):
+            try:
+                msg = self.format(record)
+                self.buffer.append(msg)  # Add message to the buffer
+                self.new_record.emit(msg)  # Emit signal
+            except RuntimeError:
+                pass

-    def get_buffer(self):
-        return list(self.buffer)  # Return a copy of the buffer
+        def get_buffer(self):
+            return list(self.buffer)  # Return a copy of the buffer
+
+    buffer_handler = BufferingHandler()
+    buffer_handler.setFormatter(logging.getLogger().handlers[0].formatter)
+    new_logger = logging.getLogger()
+    new_logger.addHandler(buffer_handler)
+    return buffer_handler
+
+
+def __show_gui(buffer_handler):
+    # lazy load GUI frameworks
+    from PyQt6.QtWidgets import QApplication
+
+    # load application
+    app: QApplication = QApplication(sys.argv)
+
+    # configure main window
+    from src.ui.main_window import MainWindow
+    window: MainWindow = MainWindow()
+    window.buffer_handler = buffer_handler
+    window.show()
+    return app.exec()
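The nested `BufferingHandler` above keeps the last `capacity` formatted records in a bounded `deque` and forwards each one to the GUI console. The buffering behaviour works without Qt; this sketch replaces the `pyqtSignal` with a plain callback, an assumption made only so the example is self-contained:

```python
import logging
from collections import deque

class BufferingHandler(logging.Handler):
    """Keep the most recent `capacity` formatted records in memory."""
    def __init__(self, capacity=100, on_record=None):
        super().__init__()
        self.buffer = deque(maxlen=capacity)  # old entries fall off automatically
        self.on_record = on_record            # stand-in for the pyqtSignal

    def emit(self, record):
        msg = self.format(record)
        self.buffer.append(msg)
        if self.on_record:
            self.on_record(msg)

    def get_buffer(self):
        return list(self.buffer)  # return a copy of the buffer

handler = BufferingHandler(capacity=2)
log = logging.getLogger('demo')
log.addHandler(handler)
log.warning('one')
log.warning('two')
log.warning('three')
print(handler.get_buffer())  # → ['two', 'three']
```

Because the deque is created with `maxlen=capacity`, the oldest message is discarded automatically once the buffer is full.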
+118 -69
@@ -2,12 +2,13 @@ import logging
 import os
 from datetime import datetime

+from pubsub import pub
 from sqlalchemy import create_engine
 from sqlalchemy.orm import sessionmaker
+from sqlalchemy.orm.exc import DetachedInstanceError

-from src.utilities.status_utils import RenderStatus
-from src.engines.engine_manager import EngineManager
 from src.engines.core.base_worker import Base
+from src.utilities.status_utils import RenderStatus

 logger = logging.getLogger()
@@ -17,6 +18,9 @@ class JobNotFoundError(Exception):
         super().__init__(args)
         self.job_id = job_id

+    def __str__(self):
+        return f"Cannot find job with ID: {self.job_id}"
+

 class RenderQueue:
     engine = None
@@ -24,18 +28,53 @@ class RenderQueue:
     job_queue = []
     maximum_renderer_instances = {'blender': 1, 'aerender': 1, 'ffmpeg': 4}
     last_saved_counts = {}
+    is_running = False

-    def __init__(self):
-        pass
+    # --------------------------------------------
+    # Render Queue Evaluation:
+    # --------------------------------------------
+    @classmethod
+    def start(cls):
+        """Start evaluating the render queue"""
+        logger.debug("Starting render queue updates")
+        cls.is_running = True
+        cls.evaluate_queue()
+
+    @classmethod
+    def evaluate_queue(cls):
+        try:
+            not_started = cls.jobs_with_status(RenderStatus.NOT_STARTED, priority_sorted=True)
+            for job in not_started:
+                if cls.is_available_for_job(job.renderer, job.priority):
+                    cls.start_job(job)
+            scheduled = cls.jobs_with_status(RenderStatus.SCHEDULED, priority_sorted=True)
+            for job in scheduled:
+                if job.scheduled_start <= datetime.now():
+                    logger.debug(f"Starting scheduled job: {job}")
+                    cls.start_job(job)
+            if cls.last_saved_counts != cls.job_counts():
+                cls.save_state()
+        except DetachedInstanceError:
+            pass

     @classmethod
-    def add_to_render_queue(cls, render_job, force_start=False):
-        logger.debug('Adding priority {} job to render queue: {}'.format(render_job.priority, render_job))
-        cls.job_queue.append(render_job)
-        if force_start:
-            cls.start_job(render_job)
-        cls.session.add(render_job)
-        cls.save_state()
+    def __local_job_status_changed(cls, job_id, old_status, new_status):
+        render_job = RenderQueue.job_with_id(job_id, none_ok=True)
+        if render_job and cls.is_running:  # ignore changes from render jobs not in the queue yet
+            logger.debug(f"RenderQueue detected job {job_id} has changed from {old_status} -> {new_status}")
+            RenderQueue.evaluate_queue()
+
+    @classmethod
+    def stop(cls):
+        logger.debug("Stopping render queue updates")
+        cls.is_running = False
+
+    # --------------------------------------------
+    # Fetch Jobs:
+    # --------------------------------------------

     @classmethod
     def all_jobs(cls):
@@ -66,12 +105,15 @@ class RenderQueue:
         return found_job

     @classmethod
-    def clear_history(cls):
-        to_remove = [x for x in cls.all_jobs() if x.status in [RenderStatus.CANCELLED,
-                                                               RenderStatus.COMPLETED, RenderStatus.ERROR]]
-        for job_to_remove in to_remove:
-            cls.delete_job(job_to_remove)
-        cls.save_state()
+    def job_counts(cls):
+        job_counts = {}
+        for job_status in RenderStatus:
+            job_counts[job_status.value] = len(cls.jobs_with_status(job_status))
+        return job_counts
+
+    # --------------------------------------------
+    # Startup / Shutdown:
+    # --------------------------------------------

     @classmethod
     def load_state(cls, database_directory):
@@ -81,6 +123,7 @@ class RenderQueue:
         cls.session = sessionmaker(bind=cls.engine)()
         from src.engines.core.base_worker import BaseRenderWorker
         cls.job_queue = cls.session.query(BaseRenderWorker).all()
+        pub.subscribe(cls.__local_job_status_changed, 'status_change')

     @classmethod
     def save_state(cls):
@@ -89,59 +132,15 @@ class RenderQueue:
     @classmethod
     def prepare_for_shutdown(cls):
         logger.debug("Closing session")
+        cls.stop()
         running_jobs = cls.jobs_with_status(RenderStatus.RUNNING)  # cancel all running jobs
         [cls.cancel_job(job) for job in running_jobs]
         cls.save_state()
         cls.session.close()

-    @classmethod
-    def is_available_for_job(cls, renderer, priority=2):
-        if not EngineManager.all_versions_for_engine(renderer):
-            return False
-        instances = cls.renderer_instances()
-        higher_priority_jobs = [x for x in cls.running_jobs() if x.priority < priority]
-        max_allowed_instances = cls.maximum_renderer_instances.get(renderer, 1)
-        maxed_out_instances = renderer in instances.keys() and instances[renderer] >= max_allowed_instances
-        return not maxed_out_instances and not higher_priority_jobs
-
-    @classmethod
-    def evaluate_queue(cls):
-        not_started = cls.jobs_with_status(RenderStatus.NOT_STARTED, priority_sorted=True)
-        for job in not_started:
-            if cls.is_available_for_job(job.renderer, job.priority):
-                cls.start_job(job)
-        scheduled = cls.jobs_with_status(RenderStatus.SCHEDULED, priority_sorted=True)
-        for job in scheduled:
-            if job.scheduled_start <= datetime.now():
-                logger.debug(f"Starting scheduled job: {job}")
-                cls.start_job(job)
-        if cls.last_saved_counts != cls.job_counts():
-            cls.save_state()
-
-    @classmethod
-    def start_job(cls, job):
-        logger.info(f'Starting render: {job.name} - Priority {job.priority}')
-        job.start()
-        cls.save_state()
-
-    @classmethod
-    def cancel_job(cls, job):
-        logger.info(f'Cancelling job ID: {job.id}')
-        job.stop()
-        return job.status == RenderStatus.CANCELLED
-
-    @classmethod
-    def delete_job(cls, job):
-        logger.info(f"Deleting job ID: {job.id}")
-        job.stop()
-        cls.job_queue.remove(job)
-        cls.session.delete(job)
-        cls.save_state()
-        return True
+    # --------------------------------------------
+    # Renderer Availability:
+    # --------------------------------------------

     @classmethod
     def renderer_instances(cls):
@@ -150,8 +149,58 @@ class RenderQueue:
         return Counter(all_instances)

     @classmethod
-    def job_counts(cls):
-        job_counts = {}
-        for job_status in RenderStatus:
-            job_counts[job_status.value] = len(cls.jobs_with_status(job_status))
-        return job_counts
+    def is_available_for_job(cls, renderer, priority=2):
+        instances = cls.renderer_instances()
+        higher_priority_jobs = [x for x in cls.running_jobs() if x.priority < priority]
+        max_allowed_instances = cls.maximum_renderer_instances.get(renderer, 1)
+        maxed_out_instances = renderer in instances.keys() and instances[renderer] >= max_allowed_instances
+        return not maxed_out_instances and not higher_priority_jobs
+
+    # --------------------------------------------
+    # Job Lifecycle Management:
+    # --------------------------------------------
+
+    @classmethod
+    def add_to_render_queue(cls, render_job, force_start=False):
+        logger.info(f"Adding job to render queue: {render_job}")
+        cls.job_queue.append(render_job)
+        if cls.is_running and force_start and render_job.status in (RenderStatus.NOT_STARTED, RenderStatus.SCHEDULED):
+            cls.start_job(render_job)
+        cls.session.add(render_job)
+        cls.save_state()
+        if cls.is_running:
+            cls.evaluate_queue()
+
+    @classmethod
+    def start_job(cls, job):
+        logger.info(f'Starting job: {job}')
+        job.start()
+        cls.save_state()
+
+    @classmethod
+    def cancel_job(cls, job):
+        logger.info(f'Cancelling job: {job}')
+        job.stop()
+        return job.status == RenderStatus.CANCELLED
+
+    @classmethod
+    def delete_job(cls, job):
+        logger.info(f"Deleting job: {job}")
+        job.stop()
+        cls.job_queue.remove(job)
+        cls.session.delete(job)
+        cls.save_state()
+        return True
+
+    # --------------------------------------------
+    # Miscellaneous:
+    # --------------------------------------------
+
+    @classmethod
+    def clear_history(cls):
+        to_remove = [x for x in cls.all_jobs() if x.status in [RenderStatus.CANCELLED,
+                                                               RenderStatus.COMPLETED, RenderStatus.ERROR]]
+        for job_to_remove in to_remove:
+            cls.delete_job(job_to_remove)
+        cls.save_state()
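The relocated `is_available_for_job` drops the old `EngineManager.all_versions_for_engine` check and now gates purely on running instance counts (a `Counter` over the running jobs' renderers) versus `maximum_renderer_instances`, plus a priority check. The same gating logic, isolated over fabricated running-job data:

```python
from collections import Counter

MAX_INSTANCES = {'blender': 1, 'aerender': 1, 'ffmpeg': 4}

def is_available_for_job(running_jobs, renderer, priority=2):
    # running_jobs: list of (renderer, priority) tuples; lower number = higher priority
    instances = Counter(r for r, _ in running_jobs)
    higher_priority_jobs = [p for _, p in running_jobs if p < priority]
    maxed_out = instances[renderer] >= MAX_INSTANCES.get(renderer, 1)
    return not maxed_out and not higher_priority_jobs

running = [('blender', 2)]
print(is_available_for_job(running, 'blender'))    # → False (blender capped at 1 instance)
print(is_available_for_job(running, 'ffmpeg', 3))  # → False (a priority-2 job outranks it)
print(is_available_for_job(running, 'ffmpeg', 2))  # → True  (0 of 4 ffmpeg slots used)
```

Note that any renderer missing from the limits dict defaults to a single allowed instance, matching `maximum_renderer_instances.get(renderer, 1)` in the diff.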
+74
@@ -0,0 +1,74 @@
+import os
+import sys
+
+from PyQt6.QtCore import Qt
+from PyQt6.QtGui import QPixmap
+from PyQt6.QtWidgets import QDialog, QVBoxLayout, QLabel, QDialogButtonBox, QHBoxLayout
+
+from version import *
+
+
+class AboutDialog(QDialog):
+    def __init__(self):
+        super().__init__()
+        self.setWindowTitle(f"About {APP_NAME}")
+
+        # Create the layout
+        layout = QVBoxLayout()
+
+        # App Icon
+        icon_name = 'Server.png'  # todo: temp icon - replace with final later
+        icon_path = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))),
+                                 'resources', icon_name)
+        icon_label = QLabel(self)
+        icon_pixmap = QPixmap(icon_path)
+        icon_label.setPixmap(icon_pixmap)
+        icon_layout = QHBoxLayout()
+        icon_layout.addStretch()
+        icon_layout.addWidget(icon_label)
+        icon_layout.addStretch()
+        layout.addLayout(icon_layout)
+
+        # Application name
+        name_label = QLabel(f"<h2>{APP_NAME}</h2>")
+        layout.addWidget(name_label)
+
+        # Description
+        description_label = QLabel(APP_DESCRIPTION)
+        layout.addWidget(description_label)
+
+        # Version
+        version_label = QLabel(f"<strong>Version:</strong> {APP_VERSION}")
+        layout.addWidget(version_label)
+
+        # Contributors
+        contributors_label = QLabel(f"Copyright © {APP_COPYRIGHT_YEAR} {APP_AUTHOR}")
+        layout.addWidget(contributors_label)
+
+        # License
+        license_label = QLabel(f"Released under {APP_LICENSE}")
+        layout.addWidget(license_label)
+
+        # Add an "OK" button to close the dialog
+        button_box = QDialogButtonBox(QDialogButtonBox.StandardButton.Ok)
+        button_box.accepted.connect(self.accept)
+        layout.addWidget(button_box)
+
+        # Set the layout for the dialog
+        self.setLayout(layout)
+
+        # Make the dialog non-resizable
+        self.setWindowFlags(self.windowFlags() & ~Qt.WindowType.WindowContextHelpButtonHint)
+        self.setFixedSize(self.sizeHint())
+
+
+if __name__ == '__main__':
+    # lazy load GUI frameworks
+    from PyQt6.QtWidgets import QApplication
+
+    # load application
+    app: QApplication = QApplication(sys.argv)
+    window: AboutDialog = AboutDialog()
+    window.show()
+    app.exec()
+68 -49
View File
@@ -21,10 +21,17 @@ from src.utilities.zeroconf_server import ZeroconfServer
class NewRenderJobForm(QWidget): class NewRenderJobForm(QWidget):
def __init__(self, project_path=None): def __init__(self, project_path=None):
super().__init__() super().__init__()
self.notes_group = None
self.frame_rate_input = None
self.resolution_x_input = None
self.renderer_group = None
self.output_settings_group = None
self.resolution_y_input = None
self.project_path = project_path self.project_path = project_path
# UI # UI
self.project_group = None
self.load_file_group = None
self.current_engine_options = None self.current_engine_options = None
self.file_format_combo = None self.file_format_combo = None
self.renderer_options_layout = None self.renderer_options_layout = None
@@ -48,7 +55,7 @@ class NewRenderJobForm(QWidget):
self.priority_input = None self.priority_input = None
self.end_frame_input = None self.end_frame_input = None
self.start_frame_input = None self.start_frame_input = None
self.output_path_input = None self.render_name_input = None
self.scene_file_input = None self.scene_file_input = None
self.scene_file_browse_button = None self.scene_file_browse_button = None
self.job_name_input = None self.job_name_input = None
@@ -61,11 +68,12 @@ class NewRenderJobForm(QWidget):
# Setup # Setup
self.setWindowTitle("New Job") self.setWindowTitle("New Job")
self.setup_ui() self.setup_ui()
self.update_renderer_info()
self.setup_project() self.setup_project()
# get renderer info in bg thread # get renderer info in bg thread
t = threading.Thread(target=self.update_renderer_info) # t = threading.Thread(target=self.update_renderer_info)
t.start() # t.start()
self.show() self.show()
@@ -73,41 +81,41 @@ class NewRenderJobForm(QWidget):
# Main Layout # Main Layout
main_layout = QVBoxLayout(self) main_layout = QVBoxLayout(self)
# Scene File Group # Loading File Group
scene_file_group = QGroupBox("Project") self.load_file_group = QGroupBox("Loading")
scene_file_layout = QVBoxLayout(scene_file_group) load_file_layout = QVBoxLayout(self.load_file_group)
scene_file_picker_layout = QHBoxLayout()
self.scene_file_input = QLineEdit()
self.scene_file_input.setText(self.project_path)
self.scene_file_browse_button = QPushButton("Browse...")
self.scene_file_browse_button.clicked.connect(self.browse_scene_file)
scene_file_picker_layout.addWidget(self.scene_file_input)
scene_file_picker_layout.addWidget(self.scene_file_browse_button)
scene_file_layout.addLayout(scene_file_picker_layout)
# progress bar # progress bar
progress_layout = QHBoxLayout() progress_layout = QHBoxLayout()
self.process_progress_bar = QProgressBar() self.process_progress_bar = QProgressBar()
self.process_progress_bar.setMinimum(0) self.process_progress_bar.setMinimum(0)
self.process_progress_bar.setMaximum(0) self.process_progress_bar.setMaximum(0)
self.process_progress_bar.setHidden(True)
self.process_label = QLabel("Processing") self.process_label = QLabel("Processing")
self.process_label.setHidden(True)
progress_layout.addWidget(self.process_label) progress_layout.addWidget(self.process_label)
progress_layout.addWidget(self.process_progress_bar) progress_layout.addWidget(self.process_progress_bar)
scene_file_layout.addLayout(progress_layout) load_file_layout.addLayout(progress_layout)
main_layout.addWidget(scene_file_group) main_layout.addWidget(self.load_file_group)
# Server Group # Project Group
self.project_group = QGroupBox("Project")
server_layout = QVBoxLayout(self.project_group)
# File Path
scene_file_picker_layout = QHBoxLayout()
self.scene_file_input = QLineEdit()
self.scene_file_input.setText(self.project_path)
self.scene_file_browse_button = QPushButton("Browse...")
self.scene_file_browse_button.clicked.connect(self.browse_scene_file)
scene_file_picker_layout.addWidget(QLabel("File:"))
scene_file_picker_layout.addWidget(self.scene_file_input)
scene_file_picker_layout.addWidget(self.scene_file_browse_button)
server_layout.addLayout(scene_file_picker_layout)
# Server List # Server List
self.server_group = QGroupBox("Server")
server_layout = QVBoxLayout(self.server_group)
server_list_layout = QHBoxLayout() server_list_layout = QHBoxLayout()
server_list_layout.setSpacing(0) server_list_layout.setSpacing(0)
self.server_input = QComboBox() self.server_input = QComboBox()
server_list_layout.addWidget(QLabel("Hostname:"), 1) server_list_layout.addWidget(QLabel("Hostname:"), 1)
server_list_layout.addWidget(self.server_input, 3) server_list_layout.addWidget(self.server_input, 3)
server_layout.addLayout(server_list_layout) server_layout.addLayout(server_list_layout)
main_layout.addWidget(self.server_group) main_layout.addWidget(self.project_group)
self.update_server_list() self.update_server_list()
# Priority # Priority
priority_layout = QHBoxLayout() priority_layout = QHBoxLayout()
@@ -129,11 +137,11 @@ class NewRenderJobForm(QWidget):
self.output_settings_group = QGroupBox("Output Settings") self.output_settings_group = QGroupBox("Output Settings")
output_settings_layout = QVBoxLayout(self.output_settings_group) output_settings_layout = QVBoxLayout(self.output_settings_group)
# output path # output path
output_path_layout = QHBoxLayout() render_name_layout = QHBoxLayout()
output_path_layout.addWidget(QLabel("Render name:")) render_name_layout.addWidget(QLabel("Render name:"))
self.output_path_input = QLineEdit() self.render_name_input = QLineEdit()
output_path_layout.addWidget(self.output_path_input) render_name_layout.addWidget(self.render_name_input)
output_settings_layout.addLayout(output_path_layout) output_settings_layout.addLayout(render_name_layout)
# file format # file format
file_format_layout = QHBoxLayout() file_format_layout = QHBoxLayout()
file_format_layout.addWidget(QLabel("Format:")) file_format_layout.addWidget(QLabel("Format:"))
@@ -185,6 +193,7 @@ class NewRenderJobForm(QWidget):
# Version # Version
renderer_layout.addWidget(QLabel("Version:")) renderer_layout.addWidget(QLabel("Version:"))
self.renderer_version_combo = QComboBox() self.renderer_version_combo = QComboBox()
self.renderer_version_combo.addItem('latest')
renderer_layout.addWidget(self.renderer_version_combo) renderer_layout.addWidget(self.renderer_version_combo)
renderer_group_layout.addLayout(renderer_layout) renderer_group_layout.addLayout(renderer_layout)
# dynamic options # dynamic options
@@ -235,7 +244,7 @@ class NewRenderJobForm(QWidget):
def update_renderer_info(self): def update_renderer_info(self):
# get the renderer info and add them all to the ui # get the renderer info and add them all to the ui
self.renderer_info = self.server_proxy.get_renderer_info() self.renderer_info = self.server_proxy.get_renderer_info(response_type='full')
self.renderer_type.addItems(self.renderer_info.keys()) self.renderer_type.addItems(self.renderer_info.keys())
# select the best renderer for the file type # select the best renderer for the file type
engine = EngineManager.engine_for_project_path(self.project_path) engine = EngineManager.engine_for_project_path(self.project_path)
@@ -247,6 +256,7 @@ class NewRenderJobForm(QWidget):
# load the version numbers # load the version numbers
current_renderer = self.renderer_type.currentText().lower() or self.renderer_type.itemText(0) current_renderer = self.renderer_type.currentText().lower() or self.renderer_type.itemText(0)
self.renderer_version_combo.clear() self.renderer_version_combo.clear()
self.renderer_version_combo.addItem('latest')
self.file_format_combo.clear() self.file_format_combo.clear()
if current_renderer: if current_renderer:
renderer_vers = [version_info['version'] for version_info in self.renderer_info[current_renderer]['versions']] renderer_vers = [version_info['version'] for version_info in self.renderer_info[current_renderer]['versions']]
@@ -272,7 +282,7 @@ class NewRenderJobForm(QWidget):
output_name, _ = os.path.splitext(os.path.basename(self.scene_file_input.text())) output_name, _ = os.path.splitext(os.path.basename(self.scene_file_input.text()))
output_name = output_name.replace(' ', '_') output_name = output_name.replace(' ', '_')
self.output_path_input.setText(output_name) self.render_name_input.setText(output_name)
file_name = self.scene_file_input.text() file_name = self.scene_file_input.text()
# setup bg worker # setup bg worker
@@ -283,7 +293,7 @@ class NewRenderJobForm(QWidget):
def browse_output_path(self): def browse_output_path(self):
directory = QFileDialog.getExistingDirectory(self, "Select Output Directory") directory = QFileDialog.getExistingDirectory(self, "Select Output Directory")
if directory: if directory:
self.output_path_input.setText(directory) self.render_name_input.setText(directory)
def args_help_button_clicked(self): def args_help_button_clicked(self):
url = (f'http://{self.server_proxy.hostname}:{self.server_proxy.port}/api/renderer/' url = (f'http://{self.server_proxy.hostname}:{self.server_proxy.port}/api/renderer/'
@@ -307,11 +317,8 @@ class NewRenderJobForm(QWidget):
self.renderer_type.setCurrentIndex(0) #todo: find out why we don't have renderer info yet self.renderer_type.setCurrentIndex(0) #todo: find out why we don't have renderer info yet
# not ideal but if we don't have the renderer info we have to pick something # not ideal but if we don't have the renderer info we have to pick something
self.output_path_input.setText(os.path.basename(input_path))
# cleanup progress UI # cleanup progress UI
self.process_progress_bar.setHidden(True) self.load_file_group.setHidden(True)
self.process_label.setHidden(True)
self.toggle_renderer_enablement(True) self.toggle_renderer_enablement(True)
# Load scene data # Load scene data
@@ -342,10 +349,11 @@ class NewRenderJobForm(QWidget):
# Dynamic Engine Options # Dynamic Engine Options
clear_layout(self.renderer_options_layout) # clear old options clear_layout(self.renderer_options_layout) # clear old options
# dynamically populate option list # dynamically populate option list
self.current_engine_options = engine().get_options() system_info = self.renderer_info.get(engine.name(), {}).get('system_info', {})
self.current_engine_options = engine.ui_options(system_info=system_info)
for option in self.current_engine_options: for option in self.current_engine_options:
h_layout = QHBoxLayout() h_layout = QHBoxLayout()
label = QLabel(option['name'].capitalize() + ':') label = QLabel(option['name'].replace('_', ' ').capitalize() + ':')
h_layout.addWidget(label) h_layout.addWidget(label)
if option.get('options'): if option.get('options'):
combo_box = QComboBox() combo_box = QComboBox()
@@ -356,12 +364,12 @@ class NewRenderJobForm(QWidget):
text_box = QLineEdit() text_box = QLineEdit()
h_layout.addWidget(text_box) h_layout.addWidget(text_box)
self.renderer_options_layout.addLayout(h_layout) self.renderer_options_layout.addLayout(h_layout)
except AttributeError as e: except AttributeError:
pass pass
def toggle_renderer_enablement(self, enabled=False):
"""Toggle on/off all the render settings"""
self.project_group.setHidden(not enabled)
self.output_settings_group.setHidden(not enabled)
self.renderer_group.setHidden(not enabled)
self.notes_group.setHidden(not enabled)
@@ -369,7 +377,7 @@ class NewRenderJobForm(QWidget):
self.cameras_group.setHidden(True)
self.submit_button.setEnabled(enabled)
def after_job_submission(self, error_string):
# UI cleanup
self.submit_progress.setMaximum(0)
@@ -381,7 +389,7 @@ class NewRenderJobForm(QWidget):
self.toggle_renderer_enablement(True)
self.msg_box = QMessageBox()
if not error_string:
self.msg_box.setStandardButtons(QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No)
self.msg_box.setIcon(QMessageBox.Icon.Information)
self.msg_box.setText("Job successfully submitted to server. Submit another?")
@@ -392,7 +400,7 @@ class NewRenderJobForm(QWidget):
else:
self.msg_box.setStandardButtons(QMessageBox.StandardButton.Ok)
self.msg_box.setIcon(QMessageBox.Icon.Critical)
self.msg_box.setText(error_string)
self.msg_box.setWindowTitle("Error")
self.msg_box.exec()
@@ -423,7 +431,7 @@ class NewRenderJobForm(QWidget):
class SubmitWorker(QThread):
"""Worker class called to submit all the jobs to the server and update the UI accordingly"""
message_signal = pyqtSignal(str)
update_ui_signal = pyqtSignal(str, str)
def __init__(self, window):
@@ -439,18 +447,21 @@ class SubmitWorker(QThread):
self.update_ui_signal.emit(hostname, percent)
return callback
try:
hostname = self.window.server_input.currentText()
job_json = {'owner': psutil.Process().username() + '@' + socket.gethostname(),
'renderer': self.window.renderer_type.currentText().lower(),
'engine_version': self.window.renderer_version_combo.currentText(),
'args': {'raw': self.window.raw_args.text(),
'export_format': self.window.file_format_combo.currentText()},
'output_path': self.window.render_name_input.text(),
'start_frame': self.window.start_frame_input.value(),
'end_frame': self.window.end_frame_input.value(),
'priority': self.window.priority_input.currentIndex() + 1,
'notes': self.window.notes_input.toPlainText(),
'enable_split_jobs': self.window.enable_splitjobs.isChecked(),
'split_jobs_same_os': self.window.splitjobs_same_os.isChecked(),
'name': self.window.render_name_input.text()}
# get the dynamic args
for i in range(self.window.renderer_options_layout.count()):
@@ -479,7 +490,8 @@ class SubmitWorker(QThread):
for cam in selected_cameras:
job_copy = copy.deepcopy(job_json)
job_copy['args']['camera'] = cam
job_copy['name'] = job_copy['name'].replace(' ', '-') + "_" + cam.replace(' ', '')
job_copy['output_path'] = job_copy['name']
job_list.append(job_copy)
else:
job_list = [job_json]
@@ -488,9 +500,16 @@ class SubmitWorker(QThread):
engine = EngineManager.engine_with_name(self.window.renderer_type.currentText().lower())
input_path = engine().perform_presubmission_tasks(input_path)
# submit
err_msg = ""
result = self.window.server_proxy.post_job_to_server(file_path=input_path, job_list=job_list,
callback=create_callback)
if not (result and result.ok):
err_msg = "Error posting job to server."
self.message_signal.emit(err_msg)
except Exception as e:
self.message_signal.emit(str(e))
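This hunk changes the worker's reporting contract: `message_signal` now carries a plain string (empty on success) rather than a `requests.Response`, so every failure path, including raised exceptions, funnels into one message for the UI slot. A minimal Qt-free sketch of that contract (the helper name and the response stub are illustrative, not project code):

```python
def submit_and_report(post_job):
    """Run a submission callable and reduce the outcome to an error string.

    Mirrors the pattern in SubmitWorker.run: "" means success, anything
    else is an error message for the UI. `post_job` is any callable
    returning an object with an `.ok` attribute (an illustrative stand-in
    for server_proxy.post_job_to_server).
    """
    try:
        result = post_job()
        if not (result and result.ok):
            return "Error posting job to server."
        return ""  # empty string == success, matching `if not error_string`
    except Exception as exc:
        return str(exc)
```

The receiving slot (`after_job_submission`) then only needs truthiness, not the requests API, to decide between the success and error dialogs.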
class GetProjectInfoWorker(QThread):
@@ -1,4 +1,3 @@
import sys
import logging
from PyQt6.QtGui import QFont
@@ -16,7 +15,10 @@ class QSignalHandler(logging.Handler, QObject):
def emit(self, record):
msg = self.format(record)
try:
self.new_record.emit(msg)  # Emit signal
except RuntimeError:
pass
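The new RuntimeError guard matters because PyQt raises RuntimeError ("wrapped C/C++ object ... has been deleted") if a signal's receiver, here the console window, has already been destroyed when a log record arrives. A sketch of the same guard with a plain callable standing in for the `new_record` pyqtSignal (the class and parameter names are illustrative):

```python
import logging

class SafeSignalHandler(logging.Handler):
    """Forward formatted log records through a signal-like callable,
    dropping them if the Qt receiver is already gone."""

    def __init__(self, emit_fn):
        super().__init__()
        self.emit_fn = emit_fn  # stand-in for self.new_record.emit

    def emit(self, record):
        msg = self.format(record)
        try:
            self.emit_fn(msg)
        except RuntimeError:
            # Receiver window was closed and its C++ object deleted;
            # silently discard the record instead of crashing.
            pass
```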
class ConsoleWindow(QMainWindow):
@@ -4,6 +4,7 @@ import subprocess
import sys
import threading
from PyQt6.QtCore import QTimer
from PyQt6.QtWidgets import (
QMainWindow, QWidget, QVBoxLayout, QPushButton, QTableWidget, QTableWidgetItem, QHBoxLayout, QAbstractItemView,
QHeaderView, QProgressBar, QLabel, QMessageBox
@@ -11,7 +12,7 @@ from PyQt6.QtWidgets import (
from src.api.server_proxy import RenderServerProxy
from src.engines.engine_manager import EngineManager
from src.utilities.misc_helper import is_localhost, launch_url
class EngineBrowserWindow(QMainWindow):
@@ -28,6 +29,7 @@ class EngineBrowserWindow(QMainWindow):
self.setGeometry(100, 100, 500, 300)
self.engine_data = []
self.initUI()
self.init_timer()
def initUI(self):
# Central widget
@@ -82,6 +84,12 @@ class EngineBrowserWindow(QMainWindow):
self.update_download_status()
def init_timer(self):
# Set up the timer
self.timer = QTimer(self)
self.timer.timeout.connect(self.update_download_status)
self.timer.start(1000)
def update_table(self):
def update_table_worker():
@@ -90,7 +98,7 @@ class EngineBrowserWindow(QMainWindow):
return
table_data = []  # convert the data into a flat list
for _, engine_data in raw_server_data.items():
table_data.extend(engine_data['versions'])
self.engine_data = table_data
@@ -124,21 +132,19 @@ class EngineBrowserWindow(QMainWindow):
hide_progress = not bool(running_tasks)
self.progress_bar.setHidden(hide_progress)
self.progress_label.setHidden(hide_progress)
# Update the status labels
if len(EngineManager.download_tasks) == 0:
new_status = ""
elif len(EngineManager.download_tasks) == 1:
task = EngineManager.download_tasks[0]
new_status = f"Downloading {task.engine.capitalize()} {task.version}..."
else:
new_status = f"Downloading {len(EngineManager.download_tasks)} engines..."
self.progress_label.setText(new_status)
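The new status text depends only on the download-task list, so the 0/1/many branching can be factored into a pure function. A sketch, with `(engine, version)` tuples standing in for the EngineManager download-task objects (illustrative, not project code):

```python
def download_status(tasks):
    """Build the progress label text for 0, 1, or many download tasks."""
    if len(tasks) == 0:
        return ""
    if len(tasks) == 1:
        engine, version = tasks[0]
        return f"Downloading {engine.capitalize()} {version}..."
    return f"Downloading {len(tasks)} engines..."
```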
def launch_button_click(self):
engine_info = self.engine_data[self.table_widget.currentRow()]
launch_url(engine_info['path'])
if sys.platform.startswith('darwin'):
subprocess.run(['open', path])
elif sys.platform.startswith('win32'):
os.startfile(path)
elif sys.platform.startswith('linux'):
subprocess.run(['xdg-open', path])
else:
raise OSError("Unsupported operating system")
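The per-platform branches removed above are consolidated into the `launch_url` helper from `src.utilities.misc_helper`. A plausible sketch of that helper, under the assumption that it simply centralizes the deleted code:

```python
import os
import subprocess
import sys

def launch_url(path):
    """Open a file, folder, or URL with the platform's default handler.

    Sketch of the helper this diff assumes lives in
    src/utilities/misc_helper.py; the real implementation may differ.
    """
    if sys.platform.startswith('darwin'):
        subprocess.run(['open', path])
    elif sys.platform.startswith('win32'):
        os.startfile(path)  # Windows-only API
    elif sys.platform.startswith('linux'):
        subprocess.run(['xdg-open', path])
    else:
        raise OSError(f"Unsupported operating system: {sys.platform}")
```

Centralizing this removes the duplicated branching that previously lived in both `EngineBrowserWindow.launch_button_click` and `MainWindow.open_files`.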
def install_button_click(self):
self.update_download_status()
@@ -1,13 +1,14 @@
''' app/ui/main_window.py '''
import datetime
import io
import logging
import os
import socket
import subprocess
import sys
import threading
import time
import PIL
from PIL import Image
from PyQt6.QtCore import Qt, QByteArray, QBuffer, QIODevice, QThread
from PyQt6.QtGui import QPixmap, QImage, QFont, QIcon
@@ -15,20 +16,20 @@ from PyQt6.QtWidgets import QMainWindow, QWidget, QHBoxLayout, QListWidget, QTab
QTableWidgetItem, QLabel, QVBoxLayout, QHeaderView, QMessageBox, QGroupBox, QPushButton, QListWidgetItem, \
QFileDialog
from src.api.server_proxy import RenderServerProxy
from src.render_queue import RenderQueue
from src.utilities.misc_helper import get_time_elapsed, resources_dir, is_localhost
from src.utilities.status_utils import RenderStatus
from src.utilities.zeroconf_server import ZeroconfServer
from src.ui.add_job import NewRenderJobForm
from src.ui.console import ConsoleWindow
from src.ui.engine_browser import EngineBrowserWindow
from src.ui.log_viewer import LogViewer
from src.ui.widgets.menubar import MenuBar
from src.ui.widgets.proportional_image_label import ProportionalImageLabel
from src.ui.widgets.statusbar import StatusBar
from src.ui.widgets.toolbar import ToolBar
from src.api.serverproxy_manager import ServerProxyManager
from src.utilities.misc_helper import launch_url
logger = logging.getLogger()
@@ -48,6 +49,11 @@ class MainWindow(QMainWindow):
super().__init__()
# Load the queue
self.job_list_view = None
self.server_info_ram = None
self.server_info_cpu = None
self.server_info_os = None
self.server_info_hostname = None
self.engine_browser_window = None
self.server_info_group = None
self.current_hostname = None
@@ -67,7 +73,7 @@ class MainWindow(QMainWindow):
# Create a QLabel widget to display the image
self.image_label = ProportionalImageLabel()
self.image_label.setMaximumSize(700, 500)
self.image_label.setFixedHeight(300)
self.image_label.setAlignment(Qt.AlignmentFlag.AlignTop | Qt.AlignmentFlag.AlignHCenter)
self.load_image_path(os.path.join(resources_dir(), 'Rectangle.png'))
@@ -177,8 +183,13 @@ class MainWindow(QMainWindow):
def __background_update(self):
while True:
try:
self.update_servers()
self.fetch_jobs()
except RuntimeError:
pass
except Exception as e:
logger.error(f"Uncaught exception in background update: {e}")
time.sleep(0.5)
def closeEvent(self, event):
@@ -278,15 +289,25 @@ class MainWindow(QMainWindow):
def fetch_preview(job_id):
try:
default_image_path = "error.png"
before_fetch_hostname = self.current_server_proxy.hostname
response = self.current_server_proxy.request(f'job/{job_id}/thumbnail?size=big')
if response.ok:
try:
with io.BytesIO(response.content) as image_data_stream:
image = Image.open(image_data_stream)
if self.current_server_proxy.hostname == before_fetch_hostname and job_id == \
self.selected_job_ids()[0]:
self.load_image_data(image)
return
except PIL.UnidentifiedImageError:
default_image_path = response.text
else:
default_image_path = default_image_path or response.text
self.load_image_path(os.path.join(resources_dir(), default_image_path))
except ConnectionError as e:
logger.error(f"Connection error fetching image: {e}")
except Exception as e:
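The rewritten `fetch_preview` treats a non-image response body (for instance, a server error string) as the cue to fall back to a placeholder image, via `PIL.UnidentifiedImageError`. A dependency-free sketch of the same "is this actually an image?" check using standard file signatures (the magic-number constants are the real PNG/JPEG signatures; the helper name is illustrative):

```python
PNG_MAGIC = b"\x89PNG\r\n\x1a\n"  # 8-byte PNG file signature
JPEG_MAGIC = b"\xff\xd8\xff"      # JPEG start-of-image marker

def looks_like_image(data: bytes) -> bool:
    """Cheap pre-check before handing bytes to an image decoder."""
    return data.startswith(PNG_MAGIC) or data.startswith(JPEG_MAGIC)
```

Catching `UnidentifiedImageError` as the diff does is the more robust approach, since it covers every format Pillow knows; the sniffing version just makes the failure mode explicit.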
@@ -304,7 +325,7 @@ class MainWindow(QMainWindow):
current_status = self.job_list_view.item(selected_row.row(), 4).text()
# show / hide the stop button
show_stop_button = "%" in current_status
self.topbar.actions_call['Stop Job'].setEnabled(show_stop_button)
self.topbar.actions_call['Stop Job'].setVisible(show_stop_button)
self.topbar.actions_call['Delete Job'].setEnabled(not show_stop_button)
@@ -329,12 +350,15 @@ class MainWindow(QMainWindow):
self.topbar.actions_call['Open Files'].setVisible(False)
def selected_job_ids(self):
try:
selected_rows = self.job_list_view.selectionModel().selectedRows()
job_ids = []
for selected_row in selected_rows:
id_item = self.job_list_view.item(selected_row.row(), 0)
job_ids.append(id_item.text())
return job_ids
except AttributeError:
return []
def refresh_job_headers(self):
self.job_list_view.setHorizontalHeaderLabels(["ID", "Name", "Renderer", "Priority", "Status",
@@ -353,13 +377,17 @@ class MainWindow(QMainWindow):
def load_image_path(self, image_path):
# Load and set the image using QPixmap
try:
pixmap = QPixmap(image_path)
if not pixmap:
logger.error("Error loading image")
return
self.image_label.setPixmap(pixmap)
except Exception as e:
logger.error(f"Error loading image path: {e}")
def load_image_data(self, pillow_image):
try:
# Convert the Pillow Image to a QByteArray (byte buffer)
byte_array = QByteArray()
buffer = QBuffer(byte_array)
@@ -377,6 +405,8 @@ class MainWindow(QMainWindow):
logger.error("Error loading image")
return
self.image_label.setPixmap(pixmap)
except Exception as e:
logger.error(f"Error loading image data: {e}")
def update_servers(self):
found_servers = list(set(ZeroconfServer.found_hostnames() + self.added_hostnames))
@@ -401,7 +431,7 @@ class MainWindow(QMainWindow):
for hostname in found_servers:
if hostname not in current_server_list:
properties = ZeroconfServer.get_hostname_properties(hostname)
image_path = os.path.join(resources_dir(), f"{properties.get('system_os', 'Monitor')}.png")
list_widget = QListWidgetItem(QIcon(image_path), hostname)
self.server_list_view.addItem(list_widget)
@@ -438,23 +468,22 @@ class MainWindow(QMainWindow):
# Top Toolbar Buttons
self.topbar.add_button(
"Console", f"{resources_directory}/Console.png", self.open_console_window)
self.topbar.add_button(
"Engines", f"{resources_directory}/SoftwareInstaller.png", self.engine_browser)
self.topbar.add_button(
"Console", f"{resources_directory}/icons/Console.png", self.open_console_window)
self.topbar.add_separator()
self.topbar.add_button(
"Stop Job", f"{resources_directory}/StopSign.png", self.stop_job)
self.topbar.add_button(
"Delete Job", f"{resources_directory}/Trash.png", self.delete_job)
self.topbar.add_button(
"Render Log", f"{resources_directory}/Document.png", self.job_logs)
self.topbar.add_button(
"Download", f"{resources_directory}/Download.png", self.download_files)
self.topbar.add_button(
"Open Files", f"{resources_directory}/SearchFolder.png", self.open_files)
self.topbar.add_button(
"New Job", f"{resources_directory}/AddProduct.png", self.new_job)
self.addToolBar(Qt.ToolBarArea.TopToolBarArea, self.topbar)
# -- Toolbar Buttons -- #
@@ -483,7 +512,7 @@ class MainWindow(QMainWindow):
def stop_job(self, event):
"""
Event handler for the Stop Job button
"""
job_ids = self.selected_job_ids()
if not job_ids:
@@ -493,14 +522,14 @@ class MainWindow(QMainWindow):
job = next((job for job in self.current_server_proxy.get_all_jobs() if job.get('id') == job_ids[0]), None)
if job:
display_name = job.get('name', os.path.basename(job.get('input_path', '')))
message = f"Are you sure you want to stop the job:\n{display_name}?"
else:
return  # Job not found, handle this case as needed
else:
message = f"Are you sure you want to stop these {len(job_ids)} jobs?"
# Display the message box and check the response in one go
msg_box = QMessageBox(QMessageBox.Icon.Warning, "Stop Job", message,
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No, self)
if msg_box.exec() == QMessageBox.StandardButton.Yes:
@@ -536,7 +565,15 @@ class MainWindow(QMainWindow):
self.fetch_jobs(clear_table=True)
def download_files(self, event):
pass
job_ids = self.selected_job_ids()
if not job_ids:
return
import webbrowser
download_url = (f"http://{self.current_server_proxy.hostname}:{self.current_server_proxy.port}"
f"/api/job/{job_ids[0]}/download_all")
webbrowser.open(download_url)
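`download_files` now delegates to the browser; since the URL shape is fixed by the route in this diff, it can be built and verified as a standalone function (the helper name is illustrative, the `/api/job/<id>/download_all` route comes from the code above):

```python
def download_all_url(hostname, port, job_id):
    """Build the download_all endpoint URL used by download_files."""
    return f"http://{hostname}:{port}/api/job/{job_id}/download_all"
```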
def open_files(self, event):
job_ids = self.selected_job_ids()
@@ -546,15 +583,7 @@ class MainWindow(QMainWindow):
for job_id in job_ids:
job_info = self.current_server_proxy.get_job_info(job_id)
path = os.path.dirname(job_info['output_path'])
launch_url(path)
if sys.platform.startswith('darwin'):
subprocess.run(['open', path])
elif sys.platform.startswith('win32'):
os.startfile(path)
elif sys.platform.startswith('linux'):
subprocess.run(['xdg-open', path])
else:
raise OSError("Unsupported operating system")
def new_job(self) -> None:
@@ -0,0 +1,481 @@
import os
from pathlib import Path
import humanize
import socket
from datetime import datetime
from PyQt6 import QtCore
from PyQt6.QtCore import Qt, QSettings, pyqtSignal as Signal
from PyQt6.QtGui import QIcon
from PyQt6.QtWidgets import QApplication, QMainWindow, QListWidget, QListWidgetItem, QStackedWidget, QVBoxLayout, \
QWidget, QLabel, QCheckBox, QLineEdit, \
QComboBox, QPushButton, QHBoxLayout, QGroupBox, QTableWidget, QAbstractItemView, QTableWidgetItem, QHeaderView, \
QMessageBox
from api.server_proxy import RenderServerProxy
from engines.engine_manager import EngineManager
from utilities.config import Config
from utilities.misc_helper import launch_url, system_safe_path
from version import APP_AUTHOR, APP_NAME
settings = QSettings(APP_AUTHOR, APP_NAME)
class SettingsWindow(QMainWindow):
def __init__(self):
super().__init__()
if not EngineManager.engines_path: # fix issue where sometimes path was not set
EngineManager.engines_path = system_safe_path(
os.path.join(os.path.expanduser(Config.upload_folder), 'engines'))
self.installed_engines_table = None
self.setWindowTitle("Settings")
# Create the main layout
main_layout = QVBoxLayout()
# Create the sidebar (QListWidget) for navigation
self.sidebar = QListWidget()
self.sidebar.setFixedWidth(150)
# Set the icon size
self.sidebar.setIconSize(QtCore.QSize(32, 32)) # Increase the icon size to 32x32 pixels
# Adjust the font size for the sidebar items
font = self.sidebar.font()
font.setPointSize(12) # Increase the font size
self.sidebar.setFont(font)
# Add items with icons to the sidebar
resources_dir = os.path.join(Path(__file__).resolve().parent.parent.parent, 'resources')
self.add_sidebar_item("General", os.path.join(resources_dir, "Gear.png"))
self.add_sidebar_item("Server", os.path.join(resources_dir, "Server.png"))
self.add_sidebar_item("Engines", os.path.join(resources_dir, "Blender.png"))
self.sidebar.setCurrentRow(0)
# Create the stacked widget to hold different settings pages
self.stacked_widget = QStackedWidget()
# Create pages for each section
general_page = self.create_general_page()
network_page = self.create_network_page()
engines_page = self.create_engines_page()
# Add pages to the stacked widget
self.stacked_widget.addWidget(general_page)
self.stacked_widget.addWidget(network_page)
self.stacked_widget.addWidget(engines_page)
# Connect the sidebar to the stacked widget
self.sidebar.currentRowChanged.connect(self.stacked_widget.setCurrentIndex)
# Create a horizontal layout to hold the sidebar and stacked widget
content_layout = QHBoxLayout()
content_layout.addWidget(self.sidebar)
content_layout.addWidget(self.stacked_widget)
# Add the content layout to the main layout
main_layout.addLayout(content_layout)
# Add the "OK" button at the bottom
ok_button = QPushButton("OK")
ok_button.clicked.connect(self.close)
ok_button.setFixedWidth(80)
ok_button.setDefault(True)
main_layout.addWidget(ok_button, alignment=Qt.AlignmentFlag.AlignRight)
# Create a central widget and set the layout
central_widget = QWidget()
central_widget.setLayout(main_layout)
self.setCentralWidget(central_widget)
self.setMinimumSize(700, 400)
def add_sidebar_item(self, name, icon_path):
"""Add an item with an icon to the sidebar."""
item = QListWidgetItem(QIcon(icon_path), name)
self.sidebar.addItem(item)
def create_general_page(self):
"""Create the General settings page."""
page = QWidget()
layout = QVBoxLayout()
# Startup Settings Group
startup_group = QGroupBox("Startup Settings")
startup_layout = QVBoxLayout()
# startup_layout.addWidget(QCheckBox("Start application on system startup"))
check_for_updates_checkbox = QCheckBox("Check for updates automatically")
check_for_updates_checkbox.setChecked(settings.value("auto_check_for_updates", True))
check_for_updates_checkbox.stateChanged.connect(lambda state: settings.setValue("auto_check_for_updates", bool(state)))
startup_layout.addWidget(check_for_updates_checkbox)
startup_group.setLayout(startup_layout)
# Local Files Group
data_path = os.path.expanduser(Config.upload_folder)
path_size = sum(f.stat().st_size for f in Path(data_path).rglob('*') if f.is_file())
database_group = QGroupBox("Local Files")
database_layout = QVBoxLayout()
database_layout.addWidget(QLabel(f"Local Directory: {data_path}"))
database_layout.addWidget(QLabel(f"Size: {humanize.naturalsize(path_size, binary=True)}"))
open_database_path_button = QPushButton("Open Directory")
open_database_path_button.clicked.connect(lambda: launch_url(data_path))
open_database_path_button.setFixedWidth(200)
database_layout.addWidget(open_database_path_button)
database_group.setLayout(database_layout)
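The size shown in the Local Files group is a recursive sum over every regular file in the upload folder, which `humanize.naturalsize(..., binary=True)` then renders as KiB/MiB. As a standalone helper (the function name is illustrative):

```python
from pathlib import Path

def directory_size(path):
    """Total size in bytes of all regular files under `path`, recursively."""
    return sum(f.stat().st_size for f in Path(path).rglob('*') if f.is_file())
```

Note that `rglob('*')` walks the whole tree on every Settings open; for very large upload folders this could be worth caching or running off the UI thread.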
# Render Settings Group
render_settings_group = QGroupBox("Render Settings")
render_settings_layout = QVBoxLayout()
render_settings_layout.addWidget(QLabel("Restrict to render nodes with same:"))
require_same_engine_checkbox = QCheckBox("Renderer Version")
require_same_engine_checkbox.setChecked(settings.value("render_require_same_engine_version", False))
require_same_engine_checkbox.stateChanged.connect(lambda state: settings.setValue("render_require_same_engine_version", bool(state)))
render_settings_layout.addWidget(require_same_engine_checkbox)
require_same_cpu_checkbox = QCheckBox("CPU Architecture")
require_same_cpu_checkbox.setChecked(settings.value("render_require_same_cpu_type", False))
require_same_cpu_checkbox.stateChanged.connect(lambda state: settings.setValue("render_require_same_cpu_type", bool(state)))
render_settings_layout.addWidget(require_same_cpu_checkbox)
require_same_os_checkbox = QCheckBox("Operating System")
require_same_os_checkbox.setChecked(settings.value("render_require_same_os", False))
require_same_os_checkbox.stateChanged.connect(lambda state: settings.setValue("render_require_same_os", bool(state)))
render_settings_layout.addWidget(require_same_os_checkbox)
render_settings_group.setLayout(render_settings_layout)
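One caveat with the `settings.value(...)` calls above: with QSettings' INI backend, stored booleans can come back as the strings `'true'`/`'false'`, and `bool('false')` is `True`, so a checkbox can silently flip on. Passing `type=bool` to `settings.value` is the Qt-native fix; a Qt-free sketch of an equivalent normalizer (the helper is an assumption for illustration, not project code):

```python
def as_bool(value, default=False):
    """Coerce a QSettings-style value to a real bool.

    QSettings may hand back None (unset key), an actual bool, an int,
    or the strings 'true'/'false' depending on platform and backend.
    """
    if value is None:
        return default
    if isinstance(value, bool):
        return value
    return str(value).strip().lower() in ("1", "true", "yes", "on")
```

Calling `checkbox.setChecked(as_bool(settings.value(key)))` would then behave the same across the Windows registry, macOS plist, and INI backends.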
layout.addWidget(startup_group)
layout.addWidget(database_group)
layout.addWidget(render_settings_group)
layout.addStretch() # Add a stretch to push content to the top
page.setLayout(layout)
return page
def create_network_page(self):
"""Create the Network settings page."""
page = QWidget()
layout = QVBoxLayout()
# Sharing Settings Group
sharing_group = QGroupBox("Sharing Settings")
sharing_layout = QVBoxLayout()
enable_sharing_checkbox = QCheckBox("Enable other computers on the network to render to this machine")
enable_sharing_checkbox.setChecked(settings.value("enable_network_sharing", False))
enable_sharing_checkbox.stateChanged.connect(self.toggle_render_sharing)
sharing_layout.addWidget(enable_sharing_checkbox)
password_layout = QHBoxLayout()
password_layout.setContentsMargins(0, 0, 0, 0)
self.enable_network_password_checkbox = QCheckBox("Enable network password:")
self.enable_network_password_checkbox.setChecked(settings.value("enable_network_password", False))
self.enable_network_password_checkbox.stateChanged.connect(self.enable_network_password_changed)
sharing_layout.addWidget(self.enable_network_password_checkbox)
self.network_password_line = QLineEdit()
self.network_password_line.setPlaceholderText("Enter a password")
self.network_password_line.setEchoMode(QLineEdit.EchoMode.Password)
self.network_password_line.setEnabled(settings.value("enable_network_password", False))
password_layout.addWidget(self.network_password_line)
self.show_password_button = QPushButton("Show")
self.show_password_button.setEnabled(settings.value("enable_network_password", False))
self.show_password_button.clicked.connect(self.show_password_button_pressed)
password_layout.addWidget(self.show_password_button)
sharing_layout.addLayout(password_layout)
sharing_group.setLayout(sharing_layout)
layout.addWidget(sharing_group)
layout.addStretch() # Add a stretch to push content to the top
page.setLayout(layout)
return page
def toggle_render_sharing(self, enable_sharing):
settings.setValue("enable_network_sharing", enable_sharing)
self.enable_network_password_checkbox.setEnabled(enable_sharing)
enable_password = enable_sharing and settings.value("enable_network_password", False)
self.network_password_line.setEnabled(enable_password)
self.show_password_button.setEnabled(enable_password)
def enable_network_password_changed(self, new_value):
settings.setValue("enable_network_password", new_value)
self.network_password_line.setEnabled(new_value)
self.show_password_button.setEnabled(new_value)
def show_password_button_pressed(self):
# toggle showing / hiding the password
show_pass = self.show_password_button.text() == "Show"
self.show_password_button.setText("Hide" if show_pass else "Show")
self.network_password_line.setEchoMode(QLineEdit.EchoMode.Normal if show_pass else QLineEdit.EchoMode.Password)
def create_engines_page(self):
"""Create the Engines settings page."""
page = QWidget()
layout = QVBoxLayout()
# Installed Engines Group
installed_group = QGroupBox("Installed Engines")
installed_layout = QVBoxLayout()
# Setup table
self.installed_engines_table = EngineTableWidget()
self.installed_engines_table.row_selected.connect(self.engine_table_selected)
installed_layout.addWidget(self.installed_engines_table)
# Ignore system installs
engine_ignore_system_installs_checkbox = QCheckBox("Ignore system installs")
engine_ignore_system_installs_checkbox.setChecked(settings.value("engines_ignore_system_installs", False))
engine_ignore_system_installs_checkbox.stateChanged.connect(self.change_ignore_system_installs)
installed_layout.addWidget(engine_ignore_system_installs_checkbox)
# Engine Launch / Delete buttons
installed_buttons_layout = QHBoxLayout()
self.launch_engine_button = QPushButton("Launch")
self.launch_engine_button.setEnabled(False)
self.launch_engine_button.clicked.connect(self.launch_selected_engine)
self.delete_engine_button = QPushButton("Delete")
self.delete_engine_button.setEnabled(False)
self.delete_engine_button.clicked.connect(self.delete_selected_engine)
installed_buttons_layout.addWidget(self.launch_engine_button)
installed_buttons_layout.addWidget(self.delete_engine_button)
installed_layout.addLayout(installed_buttons_layout)
installed_group.setLayout(installed_layout)
# Engine Updates Group
engine_updates_group = QGroupBox("Auto-Install")
engine_updates_layout = QVBoxLayout()
engine_download_layout = QHBoxLayout()
engine_download_layout.addWidget(QLabel("Enable Downloads for:"))
at_least_one_downloadable = False
for engine in EngineManager.downloadable_engines():
engine_download_check = QCheckBox(engine.name())
is_checked = settings.value(f"engine_download-{engine.name()}", False)
at_least_one_downloadable |= is_checked
engine_download_check.setChecked(is_checked)
# Capture the checkbox correctly using a default argument in lambda
engine_download_check.clicked.connect(
lambda state, checkbox=engine_download_check: self.engine_download_settings_changed(state, checkbox.text())
)
engine_download_layout.addWidget(engine_download_check)
engine_updates_layout.addLayout(engine_download_layout)
        self.check_for_engine_updates_checkbox = QCheckBox("Check for new versions on launch")
        self.check_for_engine_updates_checkbox.setChecked(settings.value('check_for_engine_updates_on_launch', True))
        self.check_for_engine_updates_checkbox.setEnabled(at_least_one_downloadable)
        self.check_for_engine_updates_checkbox.stateChanged.connect(
            lambda state: settings.setValue("check_for_engine_updates_on_launch", bool(state)))
        engine_updates_layout.addWidget(self.check_for_engine_updates_checkbox)
self.engines_last_update_label = QLabel()
self.update_last_checked_label()
self.engines_last_update_label.setEnabled(at_least_one_downloadable)
engine_updates_layout.addWidget(self.engines_last_update_label)
self.check_for_new_engines_button = QPushButton("Check for New Versions...")
self.check_for_new_engines_button.setEnabled(at_least_one_downloadable)
self.check_for_new_engines_button.clicked.connect(self.check_for_new_engines)
engine_updates_layout.addWidget(self.check_for_new_engines_button)
engine_updates_group.setLayout(engine_updates_layout)
layout.addWidget(installed_group)
layout.addWidget(engine_updates_group)
layout.addStretch() # Add a stretch to push content to the top
page.setLayout(layout)
return page
def change_ignore_system_installs(self, value):
settings.setValue("engines_ignore_system_installs", bool(value))
self.installed_engines_table.update_table()
def update_last_checked_label(self):
        """Retrieve the last update-check timestamp and update the label with a human-friendly string."""
last_checked_str = settings.value("engines_last_update_time", None)
if not last_checked_str:
time_string = "Never"
else:
last_checked_dt = datetime.fromisoformat(last_checked_str)
now = datetime.now()
time_string = humanize.naturaltime(now - last_checked_dt)
self.engines_last_update_label.setText(f"Last Updated: {time_string}")
def engine_download_settings_changed(self, state, engine_name):
settings.setValue(f"engine_download-{engine_name}", state)
at_least_one_downloadable = False
for engine in EngineManager.downloadable_engines():
at_least_one_downloadable |= settings.value(f"engine_download-{engine.name()}", False)
self.check_for_new_engines_button.setEnabled(at_least_one_downloadable)
self.check_for_engine_updates_checkbox.setEnabled(at_least_one_downloadable)
self.engines_last_update_label.setEnabled(at_least_one_downloadable)
def delete_selected_engine(self):
engine_info = self.installed_engines_table.selected_engine_data()
reply = QMessageBox.question(self, f"Delete {engine_info['engine']} {engine_info['version']}?",
f"Do you want to delete {engine_info['engine']} {engine_info['version']}?",
QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No)
if reply is not QMessageBox.StandardButton.Yes:
return
delete_result = EngineManager.delete_engine_download(engine_info.get('engine'),
engine_info.get('version'),
engine_info.get('system_os'),
engine_info.get('cpu'))
if delete_result:
QMessageBox.information(self, f"{engine_info['engine']} {engine_info['version']} Deleted",
f"{engine_info['engine']} {engine_info['version']} deleted successfully",
QMessageBox.StandardButton.Ok)
else:
            QMessageBox.warning(self, "Unknown Error",
f"Unknown error while deleting {engine_info['engine']} {engine_info['version']}.",
QMessageBox.StandardButton.Ok)
self.installed_engines_table.update_table(use_cached=False)
def launch_selected_engine(self):
engine_info = self.installed_engines_table.selected_engine_data()
if engine_info:
launch_url(engine_info['path'])
def engine_table_selected(self):
engine_data = self.installed_engines_table.selected_engine_data()
if engine_data:
            self.launch_engine_button.setEnabled(bool(engine_data.get('path')))
self.delete_engine_button.setEnabled(engine_data.get('type') == 'managed')
else:
self.launch_engine_button.setEnabled(False)
self.delete_engine_button.setEnabled(False)
def check_for_new_engines(self):
ignore_system = settings.value("engines_ignore_system_installs", False)
messagebox_shown = False
for engine in EngineManager.downloadable_engines():
if settings.value(f'engine_download-{engine.name()}', False):
result = EngineManager.is_engine_update_available(engine, ignore_system_installs=ignore_system)
if result:
result['name'] = engine.name()
msg_box = QMessageBox()
msg_box.setWindowTitle(f"{result['name']} ({result['version']}) Available")
msg_box.setText(f"A new version of {result['name']} is available ({result['version']}).\n\n"
f"Would you like to download it now?")
msg_box.setIcon(QMessageBox.Icon.Question)
msg_box.setStandardButtons(QMessageBox.StandardButton.Yes | QMessageBox.StandardButton.No)
msg_result = msg_box.exec()
messagebox_shown = True
if msg_result == QMessageBox.StandardButton.Yes:
EngineManager.download_engine(engine=engine.name(), version=result['version'], background=True,
ignore_system=ignore_system)
self.update_engine_download_status()
if not messagebox_shown:
msg_box = QMessageBox()
msg_box.setWindowTitle("No Updates Available")
msg_box.setText("All your render engines are up-to-date.")
msg_box.setIcon(QMessageBox.Icon.Information)
msg_box.setStandardButtons(QMessageBox.StandardButton.Ok)
msg_box.exec()
settings.setValue("engines_last_update_time", datetime.now().isoformat())
self.update_engine_download_status()
def update_engine_download_status(self):
running_tasks = [x for x in EngineManager.download_tasks if x.is_alive()]
if not running_tasks:
self.update_last_checked_label()
return
self.engines_last_update_label.setText(f"Downloading {running_tasks[0].engine} ({running_tasks[0].version})...")
class EngineTableWidget(QWidget):
row_selected = Signal()
def __init__(self):
super().__init__()
self.table = QTableWidget(0, 4)
self.table.setHorizontalHeaderLabels(["Engine", "Version", "Type", "Path"])
self.table.setSelectionBehavior(QAbstractItemView.SelectionBehavior.SelectRows)
self.table.verticalHeader().setVisible(False)
# self.table_widget.itemSelectionChanged.connect(self.engine_picked)
self.table.setEditTriggers(QAbstractItemView.EditTrigger.NoEditTriggers)
self.table.selectionModel().selectionChanged.connect(self.on_selection_changed)
layout = QVBoxLayout(self)
layout.setContentsMargins(0, 0, 0, 0)
layout.setSpacing(0)
layout.addWidget(self.table)
self.raw_server_data = None
def showEvent(self, event):
"""Runs when the widget is about to be shown."""
self.update_table()
super().showEvent(event) # Ensure normal event processing
def update_table(self, use_cached=True):
if not self.raw_server_data or not use_cached:
self.raw_server_data = RenderServerProxy(socket.gethostname()).get_renderer_info()
if not self.raw_server_data:
return
table_data = [] # convert the data into a flat list
for _, engine_data in self.raw_server_data.items():
table_data.extend(engine_data['versions'])
if settings.value("engines_ignore_system_installs", False):
table_data = [x for x in table_data if x['type'] != 'system']
self.table.clear()
self.table.setRowCount(len(table_data))
self.table.setColumnCount(4)
self.table.setHorizontalHeaderLabels(['Engine', 'Version', 'Type', 'Path'])
self.table.horizontalHeader().setSectionResizeMode(0, QHeaderView.ResizeMode.Fixed)
self.table.horizontalHeader().setSectionResizeMode(1, QHeaderView.ResizeMode.Fixed)
self.table.horizontalHeader().setSectionResizeMode(2, QHeaderView.ResizeMode.Fixed)
self.table.horizontalHeader().setSectionResizeMode(3, QHeaderView.ResizeMode.Stretch)
for row, engine in enumerate(table_data):
self.table.setItem(row, 0, QTableWidgetItem(engine['engine']))
self.table.setItem(row, 1, QTableWidgetItem(engine['version']))
self.table.setItem(row, 2, QTableWidgetItem(engine['type']))
self.table.setItem(row, 3, QTableWidgetItem(engine['path']))
self.table.selectRow(0)
def selected_engine_data(self):
"""Returns the data from the selected row as a dictionary."""
row = self.table.currentRow() # Get the selected row index
if row < 0 or not len(self.table.selectedItems()): # No row selected
return None
data = {
"engine": self.table.item(row, 0).text(),
"version": self.table.item(row, 1).text(),
"type": self.table.item(row, 2).text(),
"path": self.table.item(row, 3).text(),
}
return data
def on_selection_changed(self):
self.row_selected.emit()
if __name__ == "__main__":
app = QApplication([])
window = SettingsWindow()
window.show()
app.exec()
-1
@@ -1 +0,0 @@
-''' app/ui/widgets/dialog.py '''

+88 -6
@@ -1,5 +1,6 @@
 ''' app/ui/widgets/menubar.py '''
-from PyQt6.QtWidgets import QMenuBar
+from PyQt6.QtGui import QAction
+from PyQt6.QtWidgets import QMenuBar, QApplication, QMessageBox, QDialog, QVBoxLayout, QLabel, QPushButton


 class MenuBar(QMenuBar):
@@ -12,12 +13,93 @@ class MenuBar(QMenuBar):
     def __init__(self, parent=None) -> None:
         super().__init__(parent)
+        self.settings_window = None
+        # setup menus
         file_menu = self.addMenu("File")
         # edit_menu = self.addMenu("Edit")
         # view_menu = self.addMenu("View")
-        # help_menu = self.addMenu("Help")
+        help_menu = self.addMenu("Help")

-        # Add actions to the menus
-        # file_menu.addAction(self.parent().topbar.actions_call["Open"])  # type: ignore
-        # file_menu.addAction(self.parent().topbar.actions_call["Save"])  # type: ignore
-        # file_menu.addAction(self.parent().topbar.actions_call["Exit"])  # type: ignore
+        # --file menu--
+        # new job
+        new_job_action = QAction("New Job...", self)
+        new_job_action.setShortcut('Ctrl+N')
+        new_job_action.triggered.connect(self.new_job)
+        file_menu.addAction(new_job_action)
+        # settings
+        settings_action = QAction("Settings...", self)
+        settings_action.triggered.connect(self.show_settings)
+        settings_action.setShortcut('Ctrl+,')
+        file_menu.addAction(settings_action)
+        # exit
+        exit_action = QAction('&Exit', self)
+        exit_action.setShortcut('Ctrl+Q')
+        exit_action.triggered.connect(QApplication.instance().quit)
+        file_menu.addAction(exit_action)
+
+        # --help menu--
+        about_action = QAction("About", self)
+        about_action.triggered.connect(self.show_about)
+        help_menu.addAction(about_action)
+        update_action = QAction("Check for Updates...", self)
+        update_action.triggered.connect(self.check_for_updates)
+        help_menu.addAction(update_action)
+
+    def new_job(self):
+        self.parent().new_job()
+
+    def show_settings(self):
+        from src.ui.settings_window import SettingsWindow
+        self.settings_window = SettingsWindow()
+        self.settings_window.show()
+
+    @staticmethod
+    def show_about():
+        from src.ui.about_window import AboutDialog
+        dialog = AboutDialog()
+        dialog.exec()
+
+    @staticmethod
+    def check_for_updates():
+        from src.utilities.misc_helper import check_for_updates
+        from version import APP_NAME, APP_VERSION, APP_REPO_NAME, APP_REPO_OWNER
+        found_update = check_for_updates(APP_REPO_NAME, APP_REPO_OWNER, APP_NAME, APP_VERSION)
+        if found_update:
+            dialog = UpdateDialog(found_update, APP_VERSION)
+            dialog.exec()
+        else:
+            QMessageBox.information(None, "No Update", "No updates available.")
+
+
+class UpdateDialog(QDialog):
+    def __init__(self, release_info, current_version, parent=None):
+        super().__init__(parent)
+        self.setWindowTitle(f"Update Available ({current_version} -> {release_info['tag_name']})")
+        layout = QVBoxLayout()
+        label = QLabel(f"A new version ({release_info['tag_name']}) is available! Current version: {current_version}")
+        layout.addWidget(label)
+        # Label to show the release notes
+        description = QLabel(release_info["body"])
+        layout.addWidget(description)
+        # Button to download the latest version
+        download_button = QPushButton(f"Download Latest Version ({release_info['tag_name']})")
+        download_button.clicked.connect(lambda: self.open_url(release_info["html_url"]))
+        layout.addWidget(download_button)
+        # OK button to dismiss the dialog
+        ok_button = QPushButton("Dismiss")
+        ok_button.clicked.connect(self.accept)  # Close the dialog when clicked
+        layout.addWidget(ok_button)
+        self.setLayout(layout)
+
+    def open_url(self, url):
+        from PyQt6.QtCore import QUrl
+        from PyQt6.QtGui import QDesktopServices
+        QDesktopServices.openUrl(QUrl(url))
+        self.accept()
+17 -7
@@ -9,6 +9,7 @@ from PyQt6.QtGui import QPixmap
 from PyQt6.QtWidgets import QStatusBar, QLabel

 from src.api.server_proxy import RenderServerProxy
+from src.engines.engine_manager import EngineManager
 from src.utilities.misc_helper import resources_dir
@@ -28,17 +29,26 @@ class StatusBar(QStatusBar):
             proxy = RenderServerProxy(socket.gethostname())
             proxy.start_background_update()
             image_names = {'Ready': 'GreenCircle.png', 'Offline': "RedSquare.png"}
-            last_update = None
             # Check for status change every 1s on background thread
             while True:
-                new_status = proxy.status()
-                if new_status is not last_update:
-                    new_image_name = image_names.get(new_status, 'Synchronize.png')
-                    image_path = os.path.join(resources_dir(), 'icons', new_image_name)
-                    self.label.setPixmap((QPixmap(image_path).scaled(16, 16, Qt.AspectRatioMode.KeepAspectRatio)))
-                    self.messageLabel.setText(new_status)
-                    last_update = new_status
+                try:
+                    # update status label - get download status
+                    new_status = proxy.status()
+                    if EngineManager.download_tasks:
+                        if len(EngineManager.download_tasks) == 1:
+                            task = EngineManager.download_tasks[0]
+                            new_status = f"{new_status} | Downloading {task.engine.capitalize()} {task.version}..."
+                        else:
+                            new_status = f"{new_status} | Downloading {len(EngineManager.download_tasks)} engines"
+                    self.messageLabel.setText(new_status)
+                    # update status image
+                    new_image_name = image_names.get(new_status, 'Synchronize.png')
+                    new_image_path = os.path.join(resources_dir(), new_image_name)
+                    self.label.setPixmap((QPixmap(new_image_path).scaled(16, 16, Qt.AspectRatioMode.KeepAspectRatio)))
+                except RuntimeError:  # ignore runtime errors during shutdown
+                    pass
                 time.sleep(1)

         background_thread = threading.Thread(target=background_update,)
@@ -47,7 +57,7 @@ class StatusBar(QStatusBar):
         # Create a label that holds an image
         self.label = QLabel()
-        image_path = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(__file__))), 'resources', 'icons',
+        image_path = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(__file__))), 'resources',
                                   'RedSquare.png')
         pixmap = (QPixmap(image_path).scaled(16, 16, Qt.AspectRatioMode.KeepAspectRatio))
         self.label.setPixmap(pixmap)
+78
@@ -0,0 +1,78 @@
+import concurrent.futures
+import os
+import time
+import logging
+
+logger = logging.getLogger()
+
+
+def cpu_workload(n):
+    # Simple arithmetic operation for workload
+    while n > 0:
+        n -= 1
+    return n
+
+
+def cpu_benchmark(duration_seconds=10):
+    # Determine the number of available CPU cores
+    num_cores = os.cpu_count()
+    # Calculate workload per core, assuming a large number for the workload
+    workload_per_core = 10000000
+    # Record start time
+    start_time = time.time()
+    # Use ProcessPoolExecutor to utilize all CPU cores
+    with concurrent.futures.ProcessPoolExecutor() as executor:
+        # Launching tasks for each core
+        futures = [executor.submit(cpu_workload, workload_per_core) for _ in range(num_cores)]
+        # Wait for all futures to complete, with a timeout to limit the benchmark duration
+        concurrent.futures.wait(futures, timeout=duration_seconds)
+    # Record end time
+    end_time = time.time()
+    # Calculate the total number of operations (workload) done by all cores
+    total_operations = workload_per_core * num_cores
+    # Calculate the total time taken
+    total_time = end_time - start_time
+    # Calculate operations per second as the score
+    score = total_operations / total_time
+    score = score * 0.0001
+    return int(score)
+
+
+def disk_io_benchmark(file_size_mb=100, filename='benchmark_test_file'):
+    write_speed = None
+    read_speed = None
+    # Measure write speed
+    start_time = time.time()
+    with open(filename, 'wb') as f:
+        f.write(os.urandom(file_size_mb * 1024 * 1024))  # Write random bytes to file
+    end_time = time.time()
+    write_time = end_time - start_time
+    write_speed = file_size_mb / write_time
+    # Measure read speed
+    start_time = time.time()
+    with open(filename, 'rb') as f:
+        content = f.read()
+    end_time = time.time()
+    read_time = end_time - start_time
+    read_speed = file_size_mb / read_time
+    # Cleanup
+    os.remove(filename)
+    logger.debug(f"Disk Write Speed: {write_speed:.2f} MB/s")
+    logger.debug(f"Disk Read Speed: {read_speed:.2f} MB/s")
+    return write_speed, read_speed
+
+
+if __name__ == '__main__':
+    print(cpu_benchmark())
+    print(disk_io_benchmark())
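A note on `disk_io_benchmark`: it writes `benchmark_test_file` into the current working directory and leaves it behind if an exception fires before `os.remove()`. A hedged variant using a temporary directory guarantees cleanup; the function name here is ours, not part of the repository:

```python
import os
import tempfile
import time


def disk_io_benchmark_safe(file_size_mb=1):
    """Measure sequential write/read speed (MB/s) using a temp file.

    A TemporaryDirectory cleans up automatically even if the benchmark raises,
    and perf_counter() is monotonic with higher resolution than time.time().
    """
    payload = os.urandom(file_size_mb * 1024 * 1024)
    with tempfile.TemporaryDirectory() as tmp_dir:
        path = os.path.join(tmp_dir, 'benchmark_test_file')

        start = time.perf_counter()
        with open(path, 'wb') as f:
            f.write(payload)
            f.flush()
            os.fsync(f.fileno())  # include OS write-back in the measurement
        write_speed = file_size_mb / (time.perf_counter() - start)

        start = time.perf_counter()
        with open(path, 'rb') as f:
            data = f.read()
        read_speed = file_size_mb / (time.perf_counter() - start)

    assert len(data) == len(payload)  # sanity check: we read back what we wrote
    return write_speed, read_speed
```

The `os.fsync` call is a judgment call: without it, small writes may only measure page-cache speed rather than the disk.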
+40 -2
@@ -1,5 +1,6 @@
 import os
 import yaml
+from src.utilities.misc_helper import current_system_os, copy_directory_contents


 class Config:
@@ -9,7 +10,7 @@ class Config:
     max_content_path = 100000000
     server_log_level = 'debug'
     log_buffer_length = 250
-    subjob_connection_timeout = 120
+    worker_process_timeout = 120
     flask_log_level = 'error'
     flask_debug_enable = False
     queue_eval_seconds = 1
@@ -27,10 +28,47 @@ class Config:
         cls.max_content_path = cfg.get('max_content_path', cls.max_content_path)
         cls.server_log_level = cfg.get('server_log_level', cls.server_log_level)
         cls.log_buffer_length = cfg.get('log_buffer_length', cls.log_buffer_length)
-        cls.subjob_connection_timeout = cfg.get('subjob_connection_timeout', cls.subjob_connection_timeout)
+        cls.worker_process_timeout = cfg.get('worker_process_timeout', cls.worker_process_timeout)
         cls.flask_log_level = cfg.get('flask_log_level', cls.flask_log_level)
         cls.flask_debug_enable = cfg.get('flask_debug_enable', cls.flask_debug_enable)
         cls.queue_eval_seconds = cfg.get('queue_eval_seconds', cls.queue_eval_seconds)
         cls.port_number = cfg.get('port_number', cls.port_number)
         cls.enable_split_jobs = cfg.get('enable_split_jobs', cls.enable_split_jobs)
         cls.download_timeout_seconds = cfg.get('download_timeout_seconds', cls.download_timeout_seconds)
+
+    @classmethod
+    def config_dir(cls):
+        # Set up the config path
+        if current_system_os() == 'macos':
+            local_config_path = os.path.expanduser('~/Library/Application Support/Zordon')
+        elif current_system_os() == 'windows':
+            local_config_path = os.path.join(os.environ['APPDATA'], 'Zordon')
+        else:
+            local_config_path = os.path.expanduser('~/.config/Zordon')
+        return local_config_path
+
+    @classmethod
+    def setup_config_dir(cls):
+        # Set up the config path
+        local_config_dir = cls.config_dir()
+        if os.path.exists(local_config_dir):
+            return
+        try:
+            # Create the local configuration directory
+            os.makedirs(local_config_dir)
+            # Determine the template path
+            resource_environment_path = os.environ.get('RESOURCEPATH')
+            if resource_environment_path:
+                template_path = os.path.join(resource_environment_path, 'config')
+            else:
+                template_path = os.path.join(
+                    os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))), 'config')
+            # Copy contents from the template to the local configuration directory
+            copy_directory_contents(template_path, local_config_dir)
+        except Exception as e:
+            print(f"An error occurred while setting up the config directory: {e}")
+            raise
+4 -3
@@ -4,9 +4,10 @@ from src.engines.ffmpeg.ffmpeg_engine import FFMPEG
 def image_sequence_to_video(source_glob_pattern, output_path, framerate=24, encoder="prores_ks", profile=4,
                             start_frame=1):
-    subprocess.run([FFMPEG.default_renderer_path(), "-framerate", str(framerate), "-start_number", str(start_frame), "-i",
-                    f"{source_glob_pattern}", "-c:v", encoder, "-profile:v", str(profile), '-pix_fmt', 'yuva444p10le',
-                    output_path], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, check=True)
+    subprocess.run([FFMPEG.default_renderer_path(), "-framerate", str(framerate), "-start_number",
+                    str(start_frame), "-i", f"{source_glob_pattern}", "-c:v", encoder, "-profile:v", str(profile),
+                    '-pix_fmt', 'yuva444p10le', output_path], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
+                    check=True)


 def save_first_frame(source_path, dest_path, max_width=1280):
+93 -29
@@ -1,7 +1,9 @@
 import logging
 import os
 import platform
+import shutil
 import socket
+import string
 import subprocess
 from datetime import datetime
@@ -9,14 +11,27 @@ logger = logging.getLogger()
 def launch_url(url):
-    if subprocess.run(['which', 'xdg-open'], capture_output=True).returncode == 0:
-        subprocess.run(['xdg-open', url])  # linux
-    elif subprocess.run(['which', 'open'], capture_output=True).returncode == 0:
-        subprocess.run(['open', url])  # macos
-    elif subprocess.run(['which', 'start'], capture_output=True).returncode == 0:
-        subprocess.run(['start', url])  # windows - need to validate this works
+    logger = logging.getLogger(__name__)
+
+    if shutil.which('xdg-open'):
+        opener = 'xdg-open'
+    elif shutil.which('open'):
+        opener = 'open'
+    elif shutil.which('cmd'):
+        opener = 'start'
     else:
-        logger.error(f"No valid launchers found to launch url: {url}")
+        error_message = f"No valid launchers found to launch URL: {url}"
+        logger.error(error_message)
+        raise OSError(error_message)
+    try:
+        if opener == 'start':
+            # For Windows, use 'cmd /c start'
+            subprocess.run(['cmd', '/c', 'start', url], shell=False)
+        else:
+            subprocess.run([opener, url])
+    except Exception as e:
+        logger.error(f"Failed to launch URL: {url}. Error: {e}")


 def file_exists_in_mounts(filepath):
@@ -34,9 +49,9 @@ def file_exists_in_mounts(filepath):
     path = os.path.normpath(path)
     components = []
     while True:
-        path, component = os.path.split(path)
-        if component:
-            components.append(component)
+        path, comp = os.path.split(path)
+        if comp:
+            components.append(comp)
         else:
             if path:
                 components.append(path)
@@ -62,20 +77,17 @@ def file_exists_in_mounts(filepath):
 def get_time_elapsed(start_time=None, end_time=None):
-    from string import Template
-
-    class DeltaTemplate(Template):
-        delimiter = "%"
-
     def strfdelta(tdelta, fmt='%H:%M:%S'):
-        d = {"D": tdelta.days}
+        days = tdelta.days
         hours, rem = divmod(tdelta.seconds, 3600)
         minutes, seconds = divmod(rem, 60)
-        d["H"] = '{:02d}'.format(hours)
-        d["M"] = '{:02d}'.format(minutes)
-        d["S"] = '{:02d}'.format(seconds)
-        t = DeltaTemplate(fmt)
-        return t.substitute(**d)
+        # Using f-strings for formatting
+        formatted_str = fmt.replace('%D', f'{days}')
+        formatted_str = formatted_str.replace('%H', f'{hours:02d}')
+        formatted_str = formatted_str.replace('%M', f'{minutes:02d}')
+        formatted_str = formatted_str.replace('%S', f'{seconds:02d}')
+        return formatted_str

     # calculate elapsed time
     elapsed_time = None
@@ -93,7 +105,7 @@ def get_time_elapsed(start_time=None, end_time=None):
 def get_file_size_human(file_path):
     size_in_bytes = os.path.getsize(file_path)
-    # Convert size to a human readable format
+    # Convert size to a human-readable format
     if size_in_bytes < 1024:
         return f"{size_in_bytes} B"
     elif size_in_bytes < 1024 ** 2:
@@ -127,15 +139,51 @@ def current_system_cpu():
 def resources_dir():
-    resources_directory = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))),
-                                       'resources')
-    return resources_directory
-
-
-def config_dir():
-    config_directory = os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))),
-                                    'config')
-    return config_directory
+    resource_environment_path = os.environ.get('RESOURCEPATH', None)
+    if resource_environment_path:  # running inside resource bundle
+        return os.path.join(resource_environment_path, 'resources')
+    else:
+        return os.path.join(os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))), 'resources')
+
+
+def copy_directory_contents(src_dir, dst_dir):
+    """
+    Copy the contents of the source directory (src_dir) to the destination directory (dst_dir).
+    """
+    for item in os.listdir(src_dir):
+        src_path = os.path.join(src_dir, item)
+        dst_path = os.path.join(dst_dir, item)
+        if os.path.isdir(src_path):
+            shutil.copytree(src_path, dst_path, dirs_exist_ok=True)
+        else:
+            shutil.copy2(src_path, dst_path)
+
+
+def check_for_updates(repo_name, repo_owner, app_name, current_version):
+    def get_github_releases(owner, repo):
+        import requests
+        url = f"https://api.github.com/repos/{owner}/{repo}/releases"
+        try:
+            response = requests.get(url, timeout=3)
+            response.raise_for_status()
+            releases = response.json()
+            return releases
+        except Exception as e:
+            logger.error(f"Error checking for updates: {e}")
+            return []
+
+    releases = get_github_releases(repo_owner, repo_name)
+    if not releases:
+        return
+    latest_version = releases[0]
+    latest_version_tag = latest_version['tag_name']
+    from packaging import version
+    if version.parse(latest_version_tag) > version.parse(current_version):
+        logger.info(f"Newer version of {app_name} available. "
+                    f"Latest: {latest_version_tag}, Current: {current_version}")
+        return latest_version


 def is_localhost(comparison_hostname):
@@ -146,3 +194,19 @@ def is_localhost(comparison_hostname):
         return comparison_hostname == local_hostname
     except AttributeError:
         return False
+
+
+def num_to_alphanumeric(num):
+    # List of possible alphanumeric characters
+    characters = string.ascii_letters + string.digits
+    # Make sure number is positive
+    num = abs(num)
+    # Convert number to alphanumeric
+    result = ""
+    while num > 0:
+        num, remainder = divmod(num, len(characters))
+        result += characters[remainder]
+    return result[::-1]  # Reverse the result to get the correct alphanumeric string
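`num_to_alphanumeric` is a base-62 positional encoding over `ascii_letters + digits`. A round-trip sketch makes the behavior concrete; the decoder `alphanumeric_to_num` is hypothetical (the repository does not define one), and note that zero encodes to the empty string:

```python
import string

CHARACTERS = string.ascii_letters + string.digits  # 62 symbols; index 0 is 'a'


def num_to_alphanumeric(num):
    # Same base-62 conversion as the helper above
    num = abs(num)
    result = ""
    while num > 0:
        num, remainder = divmod(num, len(CHARACTERS))
        result += CHARACTERS[remainder]
    return result[::-1]


def alphanumeric_to_num(text):
    # Hypothetical inverse: treat each character as a base-62 digit
    num = 0
    for char in text:
        num = num * len(CHARACTERS) + CHARACTERS.index(char)
    return num
```

For example, 62 encodes to `"ba"` (one 62 plus zero ones), and decoding reverses it exactly for any positive input.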
+188 -35
View File
@@ -1,47 +1,200 @@
import logging import logging
import os import os
import subprocess import zipfile
import threading from concurrent.futures import ThreadPoolExecutor
from src.utilities.ffmpeg_helper import generate_thumbnail, save_first_frame import requests
from src.api.server_proxy import RenderServerProxy
from src.utilities.misc_helper import get_file_size_human
from src.utilities.zeroconf_server import ZeroconfServer
logger = logging.getLogger() logger = logging.getLogger()
def generate_thumbnail_for_job(job, thumb_video_path, thumb_image_path, max_width=320): def download_missing_frames_from_subjob(local_job, subjob_id, subjob_hostname):
success = True
# Simple thread to generate thumbs in background
def generate_thumb_thread(source):
in_progress_path = thumb_video_path + '_IN-PROGRESS'
# (fragment: video/image thumbnail generation for a job)
subprocess.run(['touch', in_progress_path])
try:
    logger.debug(f"Generating video thumbnail for {source}")
    generate_thumbnail(source_path=source, dest_path=thumb_video_path, max_width=max_width)
except subprocess.CalledProcessError as err:
    logger.error(f"Error generating video thumbnail for {source}: {err}")
try:
    os.remove(in_progress_path)
except FileNotFoundError:
    pass

# Determine best source file to use for thumbs
source_files = job.file_list() or [job.input_path]
if source_files:
    video_formats = ['.mp4', '.mov', '.avi', '.mpg', '.mpeg', '.mxf', '.m4v', '.mkv']
    image_formats = ['.jpg', '.png', '.exr']
    image_files = [f for f in source_files if os.path.splitext(f)[-1].lower() in image_formats]
    video_files = [f for f in source_files if os.path.splitext(f)[-1].lower() in video_formats]
    if (video_files or image_files) and not os.path.exists(thumb_image_path):
        try:
            path_of_source = image_files[0] if image_files else video_files[0]
            logger.debug(f"Generating image thumbnail for {path_of_source}")
            save_first_frame(source_path=path_of_source, dest_path=thumb_image_path, max_width=max_width)
        except Exception as e:
            logger.error(f"Exception saving first frame: {e}")
    if video_files and not os.path.exists(thumb_video_path):
        x = threading.Thread(target=generate_thumb_thread, args=(video_files[0],))
        x.start()


# (fragment: tail of a per-file subjob download helper)
try:
    local_files = [os.path.basename(x) for x in local_job.file_list()]
    subjob_proxy = RenderServerProxy(subjob_hostname)
    subjob_files = subjob_proxy.get_job_files_list(job_id=subjob_id) or []
    for subjob_filename in subjob_files:
        if subjob_filename not in local_files:
            try:
                logger.debug(f"Downloading new file '{subjob_filename}' from {subjob_hostname}")
                local_save_path = os.path.join(os.path.dirname(local_job.output_path), subjob_filename)
                subjob_proxy.download_job_file(job_id=subjob_id, job_filename=subjob_filename,
                                               save_path=local_save_path)
                logger.debug(f'Downloaded successfully - {local_save_path}')
            except Exception as e:
                logger.error(f"Error downloading file '{subjob_filename}' from {subjob_hostname}: {e}")
                success = False
except Exception as e:
    logger.exception(f'Uncaught exception while trying to download from subjob: {e}')
    success = False
return success


def download_all_from_subjob(local_job, subjob_id, subjob_hostname):
    """
    Downloads and extracts files from a completed subjob on a remote server.

    Parameters:
        local_job (BaseRenderWorker): The local parent job worker.
        subjob_id (str or int): The ID of the subjob.
        subjob_hostname (str): The hostname of the remote server where the subjob is located.

    Returns:
        bool: True if the files have been downloaded and extracted successfully, False otherwise.
    """
    child_key = f'{subjob_id}@{subjob_hostname}'
    logname = f"{local_job.id}:{child_key}"
    zip_file_path = local_job.output_path + f'_{subjob_hostname}_{subjob_id}.zip'

    # Download zip file from server
    try:
        local_job.children[child_key]['download_status'] = 'working'
        logger.info(f"Downloading completed subjob files from {subjob_hostname} to localhost")
        RenderServerProxy(subjob_hostname).download_all_job_files(subjob_id, zip_file_path)
        logger.info(f"File transfer complete for {logname} - Transferred {get_file_size_human(zip_file_path)}")
    except Exception as e:
        logger.error(f"Error downloading files from remote server: {e}")
        local_job.children[child_key]['download_status'] = 'failed'
        return False

    # Extract zip
    try:
        logger.debug(f"Extracting zip file: {zip_file_path}")
        extract_path = os.path.dirname(zip_file_path)
        with zipfile.ZipFile(zip_file_path, 'r') as zip_ref:
            zip_ref.extractall(extract_path)
        logger.info(f"Successfully extracted zip to: {extract_path}")
        os.remove(zip_file_path)
        local_job.children[child_key]['download_status'] = 'complete'
    except Exception as e:
        logger.exception(f"Exception extracting zip file: {e}")
        local_job.children[child_key]['download_status'] = 'failed'

    return local_job.children[child_key].get('download_status', None) == 'complete'
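The download-then-extract step in `download_all_from_subjob` can be shown in isolation. A minimal sketch using a throwaway archive; the file names below are illustrative, not part of the project:

```python
# Sketch of the extract-then-delete pattern used above, on a throwaway archive.
import os
import tempfile
import zipfile

def extract_and_remove(zip_file_path):
    # Extract next to the archive, then delete the archive itself
    extract_path = os.path.dirname(zip_file_path)
    with zipfile.ZipFile(zip_file_path, 'r') as zip_ref:
        zip_ref.extractall(extract_path)
    os.remove(zip_file_path)
    return extract_path

workdir = tempfile.mkdtemp()
zip_path = os.path.join(workdir, 'subjob_files.zip')
with zipfile.ZipFile(zip_path, 'w') as zf:
    zf.writestr('frame_0001.png', b'fake frame data')

extract_and_remove(zip_path)
print(os.path.exists(os.path.join(workdir, 'frame_0001.png')))  # → True
print(os.path.exists(zip_path))  # → False
```

Deleting the archive only after a successful `extractall` is what lets the real function mark `download_status` as `'failed'` if extraction throws.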
def distribute_server_work(start_frame, end_frame, available_servers, method='evenly'):
    """
    Splits the frame range among available servers, either evenly or proportionally to each server's
    performance (CPU count or benchmark).

    Args:
        start_frame (int): The start frame number of the animation to be rendered.
        end_frame (int): The end frame number of the animation to be rendered.
        available_servers (list): A list of available server dictionaries. Each server dictionary should include
            'hostname' and 'cpu_count' keys (see find_available_servers).
        method (str, optional): Specifies the distribution method. Possible values are 'cpu_benchmark',
            'cpu_count' and 'evenly'. Defaults to 'evenly'.

    Returns:
        list: A list of server dictionaries where each dictionary includes the frame range and total number of
            frames to be rendered by the server.
    """
    # Calculate respective frames for each server
    def divide_frames_by_cpu_count(frame_start, frame_end, servers):
        total_frames = frame_end - frame_start + 1
        total_cpus = sum(server['cpu_count'] for server in servers)
        frame_ranges = {}
        current_frame = frame_start
        allocated_frames = 0
        for i, server in enumerate(servers):
            if i == len(servers) - 1:  # if it's the last server
                # Give all remaining frames to the last server
                num_frames = total_frames - allocated_frames
            else:
                num_frames = round((server['cpu_count'] / total_cpus) * total_frames)
            allocated_frames += num_frames
            frame_end_for_server = current_frame + num_frames - 1
            if current_frame <= frame_end_for_server:
                frame_ranges[server['hostname']] = (current_frame, frame_end_for_server)
            current_frame = frame_end_for_server + 1
        return frame_ranges

    def divide_frames_by_benchmark(frame_start, frame_end, servers):
        def fetch_benchmark(server):
            try:
                benchmark = requests.get(f'http://{server["hostname"]}:{ZeroconfServer.server_port}'
                                         f'/api/cpu_benchmark').text
                server['cpu_benchmark'] = benchmark
                logger.debug(f'Benchmark for {server["hostname"]}: {benchmark}')
            except requests.exceptions.RequestException as e:
                logger.error(f'Error fetching benchmark for {server["hostname"]}: {e}')

        # Number of threads to use (can adjust based on your needs or number of servers)
        threads = len(servers)
        with ThreadPoolExecutor(max_workers=threads) as executor:
            executor.map(fetch_benchmark, servers)

        total_frames = frame_end - frame_start + 1
        total_performance = sum(int(server['cpu_benchmark']) for server in servers)
        frame_ranges = {}
        current_frame = frame_start
        allocated_frames = 0
        for i, server in enumerate(servers):
            if i == len(servers) - 1:  # if it's the last server
                # Give all remaining frames to the last server
                num_frames = total_frames - allocated_frames
            else:
                num_frames = round((int(server['cpu_benchmark']) / total_performance) * total_frames)
            allocated_frames += num_frames
            frame_end_for_server = current_frame + num_frames - 1
            if current_frame <= frame_end_for_server:
                frame_ranges[server['hostname']] = (current_frame, frame_end_for_server)
            current_frame = frame_end_for_server + 1
        return frame_ranges

    def divide_frames_equally(frame_start, frame_end, servers):
        frame_range = frame_end - frame_start + 1
        frames_per_server = frame_range // len(servers)
        leftover_frames = frame_range % len(servers)
        frame_ranges = {}
        current_start = frame_start
        for i, server in enumerate(servers):
            current_end = current_start + frames_per_server - 1
            if leftover_frames > 0:
                current_end += 1
                leftover_frames -= 1
            if current_start <= current_end:
                frame_ranges[server['hostname']] = (current_start, current_end)
            current_start = current_end + 1
        return frame_ranges

    if len(available_servers) == 1:
        breakdown = {available_servers[0]['hostname']: (start_frame, end_frame)}
    else:
        logger.debug(f'Splitting between {len(available_servers)} servers by {method} method')
        if method == 'evenly':
            breakdown = divide_frames_equally(start_frame, end_frame, available_servers)
        elif method == 'cpu_benchmark':
            breakdown = divide_frames_by_benchmark(start_frame, end_frame, available_servers)
        elif method == 'cpu_count':
            breakdown = divide_frames_by_cpu_count(start_frame, end_frame, available_servers)
        else:
            raise ValueError(f"Invalid distribution method: {method}")

    server_breakdown = [server for server in available_servers if breakdown.get(server['hostname']) is not None]
    for server in server_breakdown:
        server['frame_range'] = breakdown[server['hostname']]
        server['total_frames'] = breakdown[server['hostname']][-1] - breakdown[server['hostname']][0] + 1
    return server_breakdown
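For illustration, the `'evenly'` strategy above behaves like this self-contained sketch (the hostnames are made up):

```python
# Minimal sketch of distribute_server_work's 'evenly' splitter.
def divide_frames_equally(frame_start, frame_end, servers):
    frame_range = frame_end - frame_start + 1
    frames_per_server = frame_range // len(servers)
    leftover_frames = frame_range % len(servers)
    frame_ranges = {}
    current_start = frame_start
    for server in servers:
        current_end = current_start + frames_per_server - 1
        if leftover_frames > 0:  # spread the remainder one extra frame at a time
            current_end += 1
            leftover_frames -= 1
        if current_start <= current_end:  # skip servers left with no frames
            frame_ranges[server['hostname']] = (current_start, current_end)
        current_start = current_end + 1
    return frame_ranges

servers = [{'hostname': 'render01'}, {'hostname': 'render02'}, {'hostname': 'render03'}]
print(divide_frames_equally(1, 10, servers))
# → {'render01': (1, 4), 'render02': (5, 7), 'render03': (8, 10)}
```

Note the `current_start <= current_end` guard: with more servers than frames, surplus servers simply receive no range, which is why the caller filters `available_servers` down to hosts present in `breakdown`.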
+28 -12
@@ -2,7 +2,8 @@ import logging
 import socket
 from pubsub import pub
-from zeroconf import Zeroconf, ServiceInfo, ServiceBrowser, ServiceStateChange, NonUniqueNameException
+from zeroconf import Zeroconf, ServiceInfo, ServiceBrowser, ServiceStateChange, NonUniqueNameException, \
+    NotRunningException

 logger = logging.getLogger()
@@ -22,17 +23,23 @@ class ZeroconfServer:
         cls.service_type = service_type
         cls.server_name = server_name
         cls.server_port = server_port
-        try:
-            socket.gethostbyname(socket.gethostname())
-        except socket.gaierror:
-            cls.stop()
+        # Stop any previously running instances
+        cls.stop()

     @classmethod
     def start(cls, listen_only=False):
         if not cls.service_type:
             raise RuntimeError("The 'configure' method must be run before starting the zeroconf server")
+        logger.debug("Starting zeroconf service")
         if not listen_only:
             cls._register_service()
         cls._browse_services()

     @classmethod
     def stop(cls):
+        logger.debug("Stopping zeroconf service")
         cls._unregister_service()
         cls.zeroconf.close()
@@ -52,7 +59,7 @@ class ZeroconfServer:
             cls.service_info = info
             cls.zeroconf.register_service(info)
             logger.info(f"Registered zeroconf service: {cls.service_info.name}")
-        except NonUniqueNameException as e:
+        except (NonUniqueNameException, socket.gaierror) as e:
             logger.error(f"Error establishing zeroconf: {e}")
@@ -69,40 +76,49 @@ class ZeroconfServer:
     @classmethod
     def _on_service_discovered(cls, zeroconf, service_type, name, state_change):
-        info = zeroconf.get_service_info(service_type, name)
-        logger.debug(f"Zeroconf: {name} {state_change}")
-        if service_type == cls.service_type:
-            if state_change == ServiceStateChange.Added or state_change == ServiceStateChange.Updated:
-                cls.client_cache[name] = info
-            else:
-                cls.client_cache.pop(name)
-            pub.sendMessage('zeroconf_state_change', hostname=name, state_change=state_change, info=info)
+        try:
+            info = zeroconf.get_service_info(service_type, name)
+            hostname = name.split(f'.{cls.service_type}')[0]
+            logger.debug(f"Zeroconf: {hostname} {state_change}")
+            if service_type == cls.service_type:
+                if state_change == ServiceStateChange.Added or state_change == ServiceStateChange.Updated:
+                    cls.client_cache[hostname] = info
+                else:
+                    cls.client_cache.pop(hostname)
+                pub.sendMessage('zeroconf_state_change', hostname=hostname, state_change=state_change)
+        except NotRunningException:
+            pass

     @classmethod
     def found_hostnames(cls):
-        fetched_hostnames = [x.split(f'.{cls.service_type}')[0] for x in cls.client_cache.keys()]
         local_hostname = socket.gethostname()

-        # Define a sort key function
         def sort_key(hostname):
             # Return 0 if it's the local hostname so it comes first, else return 1
             return False if hostname == local_hostname else True

         # Sort the list with the local hostname first
-        sorted_hostnames = sorted(fetched_hostnames, key=sort_key)
+        sorted_hostnames = sorted(cls.client_cache.keys(), key=sort_key)
         return sorted_hostnames

     @classmethod
     def get_hostname_properties(cls, hostname):
-        new_key = hostname + '.' + cls.service_type
-        server_info = cls.client_cache.get(new_key).properties
+        server_info = cls.client_cache.get(hostname).properties
         decoded_server_info = {key.decode('utf-8'): value.decode('utf-8') for key, value in server_info.items()}
         return decoded_server_info


 # Example usage:
 if __name__ == "__main__":
+    import time
+    logging.basicConfig(level=logging.DEBUG)
     ZeroconfServer.configure("_zordon._tcp.local.", "foobar.local", 8080)
     try:
         ZeroconfServer.start()
-        input("Server running - Press enter to end")
+        while True:
+            time.sleep(0.1)
+    except KeyboardInterrupt:
+        pass
     finally:
         ZeroconfServer.stop()
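The diff above switches the client cache key from the full zeroconf service name to a bare hostname. The string handling involved is small enough to show directly (a sketch; the names below are made up):

```python
# Derive the short hostname from a full service name, as the new
# _on_service_discovered does, then sort with the local host first,
# as found_hostnames does.
service_type = '_zordon._tcp.local.'
name = 'render01._zordon._tcp.local.'

hostname = name.split(f'.{service_type}')[0]
print(hostname)  # → render01

local_hostname = 'render02'
hostnames = ['render03', 'render02', 'render01']
# False (0) sorts before True (1), so the local host comes first;
# Python's stable sort preserves the order of the remaining hosts.
print(sorted(hostnames, key=lambda h: h != local_hostname))
# → ['render02', 'render03', 'render01']
```

Keying the cache by short hostname is also what lets `get_hostname_properties` drop its `hostname + '.' + cls.service_type` key reconstruction.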
Binary file not shown. (Before: 1.1 KiB)

Binary file not shown. (Before: 2.6 KiB)
-64
@@ -1,64 +0,0 @@
const grid = new gridjs.Grid({
columns: [
{ data: (row) => row.id,
name: 'Thumbnail',
formatter: (cell) => gridjs.html(`<img src="/api/job/${cell}/thumbnail?video_ok" style='width: 200px; min-width: 120px;'>`),
sort: {enabled: false}
},
{ id: 'name',
name: 'Name',
data: (row) => row.name,
formatter: (name, row) => gridjs.html(`<a href="/ui/job/${row.cells[0].data}/full_details">${name}</a>`)
},
{ id: 'renderer', data: (row) => `${row.renderer}-${row.renderer_version}`, name: 'Renderer' },
{ id: 'priority', name: 'Priority' },
{ id: 'status',
name: 'Status',
data: (row) => row,
formatter: (cell, row) => gridjs.html(`
<span class="tag ${(cell.status == 'running') ? 'is-hidden' : ''} ${(cell.status == 'cancelled') ?
'is-warning' : (cell.status == 'error') ? 'is-danger' : (cell.status == 'not_started') ?
'is-light' : 'is-primary'}">${cell.status}</span>
<progress class="progress is-primary ${(cell.status != 'running') ? 'is-hidden': ''}"
value="${(parseFloat(cell.percent_complete) * 100.0)}" max="100">${cell.status}</progress>
`)},
{ id: 'time_elapsed', name: 'Time Elapsed' },
{ data: (row) => row.total_frames ?? 'N/A', name: 'Frame Count' },
{ id: 'client', name: 'Client'},
{ data: (row) => row.last_output ?? 'N/A',
name: 'Last Output',
formatter: (output, row) => gridjs.html(`<a href="/api/job/${row.cells[0].data}/logs">${output}</a>`)
},
{ data: (row) => row,
name: 'Commands',
formatter: (cell, row) => gridjs.html(`
<div class="field has-addons" style='white-space: nowrap; display: inline-block;'>
<button class="button is-info" onclick="window.location.href='/ui/job/${row.cells[0].data}/full_details';">
<span class="icon"><i class="fa-solid fa-info"></i></span>
</button>
<button class="button is-link" onclick="window.location.href='/api/job/${row.cells[0].data}/logs';">
<span class="icon"><i class="fa-regular fa-file-lines"></i></span>
</button>
<button class="button is-warning is-active ${(cell.status != 'running') ? 'is-hidden': ''}" onclick="window.location.href='/api/job/${row.cells[0].data}/cancel?confirm=True&redirect=True';">
<span class="icon"><i class="fa-solid fa-x"></i></span>
</button>
<button class="button is-success ${(cell.status != 'completed') ? 'is-hidden': ''}" onclick="window.location.href='/api/job/${row.cells[0].data}/download_all';">
<span class="icon"><i class="fa-solid fa-download"></i></span>
<span>${cell.file_count}</span>
</button>
<button class="button is-danger" onclick="window.location.href='/api/job/${row.cells[0].data}/delete?confirm=True&redirect=True'">
<span class="icon"><i class="fa-regular fa-trash-can"></i></span>
</button>
</div>
`),
sort: false
},
{ id: 'owner', name: 'Owner' }
],
autoWidth: true,
server: {
url: '/api/jobs',
then: results => results['jobs'],
},
sort: true,
}).render(document.getElementById('table'));
-44
@@ -1,44 +0,0 @@
document.addEventListener('DOMContentLoaded', () => {
// Functions to open and close a modal
function openModal($el) {
$el.classList.add('is-active');
}
function closeModal($el) {
$el.classList.remove('is-active');
}
function closeAllModals() {
(document.querySelectorAll('.modal') || []).forEach(($modal) => {
closeModal($modal);
});
}
// Add a click event on buttons to open a specific modal
(document.querySelectorAll('.js-modal-trigger') || []).forEach(($trigger) => {
const modal = $trigger.dataset.target;
const $target = document.getElementById(modal);
$trigger.addEventListener('click', () => {
openModal($target);
});
});
// Add a click event on various child elements to close the parent modal
(document.querySelectorAll('.modal-background, .modal-close, .modal-card-head .delete, .modal-card-foot .button') || []).forEach(($close) => {
const $target = $close.closest('.modal');
$close.addEventListener('click', () => {
closeModal($target);
});
});
// Add a keyboard event to close all modals
document.addEventListener('keydown', (event) => {
const e = event || window.event;
if (e.keyCode === 27) { // Escape key
closeAllModals();
}
});
});
-48
@@ -1,48 +0,0 @@
{% extends 'layout.html' %}
{% block body %}
<div class="container" style="text-align:center; width: 100%">
<br>
{% if media_url: %}
<video width="1280" height="720" controls>
<source src="{{media_url}}" type="video/mp4">
Your browser does not support the video tag.
</video>
{% elif job_status == 'Running': %}
<div style="width: 100%; height: 720px; position: relative; background: black; text-align: center; color: white;">
<img src="/static/images/gears.png" style="vertical-align: middle; width: auto; height: auto; position:absolute; margin: auto; top: 0; bottom: 0; left: 0; right: 0;">
<span style="height: auto; position:absolute; margin: auto; top: 58%; left: 0; right: 0; color: white; width: 60%">
<progress class="progress is-primary" value="{{job.worker_data()['percent_complete'] * 100}}" max="100" style="margin-top: 6px;" id="progress-bar">Rendering</progress>
Rendering in Progress - <span id="percent-complete">{{(job.worker_data()['percent_complete'] * 100) | int}}%</span>
<br>Time Elapsed: <span id="time-elapsed">{{job.worker_data()['time_elapsed']}}</span>
</span>
<script>
var startingStatus = '{{job.status.value}}';
function update_job() {
$.getJSON('/api/job/{{job.id}}', function(data) {
document.getElementById('progress-bar').value = (data.percent_complete * 100);
document.getElementById('percent-complete').innerHTML = (data.percent_complete * 100).toFixed(0) + '%';
document.getElementById('time-elapsed').innerHTML = data.time_elapsed;
if (data.status != startingStatus){
clearInterval(renderingTimer);
window.location.reload(true);
};
});
}
if (startingStatus == 'running'){
var renderingTimer = setInterval(update_job, 1000);
};
</script>
</div>
{% else %}
<div style="width: 100%; height: 720px; position: relative; background: black;">
<img src="/static/images/{{job_status}}.png" style="vertical-align: middle; width: auto; height: auto; position:absolute; margin: auto; top: 0; bottom: 0; left: 0; right: 0;">
<span style="height: auto; position:absolute; margin: auto; top: 58%; left: 0; right: 0; color: white;">
{{job_status}}
</span>
</div>
{% endif %}
<br>
{{detail_table|safe}}
</div>
{% endblock %}
-8
@@ -1,8 +0,0 @@
{% extends 'layout.html' %}
{% block body %}
<div class="container is-fluid" style="padding-top: 20px;">
<div id="table" class="table"></div>
</div>
<script src="/static/js/job_table.js"></script>
{% endblock %}
-236
@@ -1,236 +0,0 @@
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>Zordon Dashboard</title>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bulma@0.9.4/css/bulma.min.css">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/4.7.0/css/font-awesome.min.css">
<script src="https://cdn.jsdelivr.net/npm/jquery/dist/jquery.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/gridjs/dist/gridjs.umd.js"></script>
<link href="https://unpkg.com/gridjs/dist/theme/mermaid.min.css" rel="stylesheet" />
<script src="https://kit.fontawesome.com/698705d14d.js" crossorigin="anonymous"></script>
<script type="text/javascript" src="/static/js/modals.js"></script>
</head>
<body onload="rendererChanged(document.getElementById('renderer'))">
<nav class="navbar is-dark" role="navigation" aria-label="main navigation">
<div class="navbar-brand">
<a class="navbar-item" href="/">
<img src="/static/images/logo.png">
</a>
</div>
<div id="navbarBasicExample" class="navbar-menu">
<div class="navbar-start">
<a class="navbar-item" href="/">
Home
</a>
</div>
<div class="navbar-end">
<div class="navbar-item">
<button class="button is-primary js-modal-trigger" data-target="add-job-modal">
<span class="icon">
<i class="fa-solid fa-upload"></i>
</span>
<span>Submit Job</span>
</button>
</div>
</div>
</div>
</nav>
{% block body %}
{% endblock %}
<div id="add-job-modal" class="modal">
<!-- Start Add Form -->
<form id="submit_job" action="/api/add_job?redirect=True" method="POST" enctype="multipart/form-data">
<div class="modal-background"></div>
<div class="modal-card">
<header class="modal-card-head">
<p class="modal-card-title">Submit New Job</p>
<button class="delete" aria-label="close" type="button"></button>
</header>
<section class="modal-card-body">
<!-- File Uploader -->
<label class="label">Upload File</label>
<div id="file-uploader" class="file has-name is-fullwidth">
<label class="file-label">
<input class="file-input is-small" type="file" name="file">
<span class="file-cta">
<span class="file-icon">
<i class="fas fa-upload"></i>
</span>
<span class="file-label">
Choose a file…
</span>
</span>
<span class="file-name">
No File Uploaded
</span>
</label>
</div>
<br>
<script>
const fileInput = document.querySelector('#file-uploader input[type=file]');
fileInput.onchange = () => {
if (fileInput.files.length > 0) {
const fileName = document.querySelector('#file-uploader .file-name');
fileName.textContent = fileInput.files[0].name;
}
}
const presets = {
{% for preset in preset_list: %}
{{preset}}: {
name: '{{preset_list[preset]['name']}}',
renderer: '{{preset_list[preset]['renderer']}}',
args: '{{preset_list[preset]['args']}}',
},
{% endfor %}
};
function rendererChanged(ddl1) {
var renderers = {
{% for renderer in renderer_info: %}
{% if renderer_info[renderer]['supported_export_formats']: %}
{{renderer}}: [
{% for format in renderer_info[renderer]['supported_export_formats']: %}
'{{format}}',
{% endfor %}
],
{% endif %}
{% endfor %}
};
var selectedRenderer = ddl1.value;
var ddl3 = document.getElementById('preset_list');
ddl3.options.length = 0;
createOption(ddl3, '-Presets-', '');
for (var preset_name in presets) {
if (presets[preset_name]['renderer'] == selectedRenderer) {
createOption(ddl3, presets[preset_name]['name'], preset_name);
};
};
document.getElementById('raw_args').value = "";
var ddl2 = document.getElementById('export_format');
ddl2.options.length = 0;
var options = renderers[selectedRenderer];
for (i = 0; i < options.length; i++) {
createOption(ddl2, options[i], options[i]);
};
}
function createOption(ddl, text, value) {
var opt = document.createElement('option');
opt.value = value;
opt.text = text;
ddl.options.add(opt);
}
function addPresetTextToInput(presetfield, textfield) {
var p = presets[presetfield.value];
textfield.value = p['args'];
}
</script>
<!-- Renderer & Priority -->
<div class="field is-grouped">
<p class="control">
<label class="label">Renderer</label>
<span class="select">
<select id="renderer" name="renderer" onchange="rendererChanged(this)">
{% for renderer in renderer_info: %}
<option name="renderer" value="{{renderer}}">{{renderer}}</option>
{% endfor %}
</select>
</span>
</p>
<p class="control">
<label class="label">Client</label>
<span class="select">
<select name="client">
<option name="client" value="">First Available</option>
{% for client in render_clients: %}
<option name="client" value="{{client}}">{{client}}</option>
{% endfor %}
</select>
</span>
</p>
<p class="control">
<label class="label">Priority</label>
<span class="select">
<select name="priority">
<option name="priority" value="1">1</option>
<option name="priority" value="2" selected="selected">2</option>
<option name="priority" value="3">3</option>
</select>
</span>
</p>
</div>
<!-- Output Path -->
<label class="label">Output</label>
<div class="field has-addons">
<div class="control is-expanded">
<input class="input is-small" type="text" placeholder="Output Name" name="output_path" value="output.mp4">
</div>
<p class="control">
<span class="select is-small">
<select id="export_format" name="export_format">
<option value="ar">option</option>
</select>
</span>
</p>
</div>
<!-- Resolution -->
<!-- <label class="label">Resolution</label>-->
<!-- <div class="field is-grouped">-->
<!-- <p class="control">-->
<!-- <input class="input" type="text" placeholder="auto" maxlength="5" size="8" name="AnyRenderer-arg_x_resolution">-->
<!-- </p>-->
<!-- <label class="label"> x </label>-->
<!-- <p class="control">-->
<!-- <input class="input" type="text" placeholder="auto" maxlength="5" size="8" name="AnyRenderer-arg_y_resolution">-->
<!-- </p>-->
<!-- <label class="label"> @ </label>-->
<!-- <p class="control">-->
<!-- <input class="input" type="text" placeholder="auto" maxlength="3" size="5" name="AnyRenderer-arg_frame_rate">-->
<!-- </p>-->
<!-- <label class="label"> fps </label>-->
<!-- </div>-->
<label class="label">Command Line Arguments</label>
<div class="field has-addons">
<p class="control">
<span class="select is-small">
<select id="preset_list" onchange="addPresetTextToInput(this, document.getElementById('raw_args'))">
<option value="preset-placeholder">presets</option>
</select>
</span>
</p>
<p class="control is-expanded">
<input class="input is-small" type="text" placeholder="Args" id="raw_args" name="raw_args">
</p>
</div>
<!-- End Add Form -->
</section>
<footer class="modal-card-foot">
<input class="button is-link" type="submit"/>
<button class="button" type="button">Cancel</button>
</footer>
</div>
</form>
</div>
</body>
</html>
-62
@@ -1,62 +0,0 @@
<html>
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
<script>
$(function() {
$('#renderer').change(function() {
$('.render_settings').hide();
$('#' + $(this).val()).show();
});
});
</script>
<body>
<h3>Upload a file</h3>
<div>
<form action="/add_job" method="POST"
enctype="multipart/form-data">
<div>
<input type="file" name="file"/><br>
</div>
<input type="hidden" id="origin" name="origin" value="html">
<div id="client">
Render Client:
<select name="client">
{% for client in render_clients %}
<option value="{{client}}">{{client}}</option>
{% endfor %}
</select>
</div>
<div id="priority">
Priority:
<select name="priority">
<option value="1">1</option>
<option value="2" selected>2</option>
<option value="3">3</option>
</select>
</div>
<div>
<label for="renderer">Renderer:</label>
<select id="renderer" name="renderer">
{% for renderer in supported_renderers %}
<option value="{{renderer}}">{{renderer}}</option>
{% endfor %}
</select>
</div>
<div id="blender" class="render_settings" style="display:none">
Engine:
<select name="blender+engine">
<option value="CYCLES">Cycles</option>
<option value="BLENDER_EEVEE">Eevee</option>
</select>
</div>
<br>
<input type="submit"/>
</form>
</div>
</body>
</html>
+8
@@ -0,0 +1,8 @@
APP_NAME = "Zordon"
APP_VERSION = "0.0.1"
APP_AUTHOR = "Brett Williams"
APP_DESCRIPTION = "Distributed Render Farm Tools"
APP_COPYRIGHT_YEAR = "2024"
APP_LICENSE = "MIT License"
APP_REPO_NAME = APP_NAME
APP_REPO_OWNER = "blw1138"