Our actions update (#19)

* Go for Broke
* Let it fire
* Add PipLine
* Create the dir if it doesn't exist
* Install Setuptools
* Track Test Action's files
* Fix Calling Job
* Track Build Action files
* Install Distutils, rename filenames
* Fix Fail conditions
* Make Build scripts smarter
* Add file
* Concat DLLs lists
* Try to fail if Error DLLs
* Try to make the fail smarter
* Moar verbosity
* Print the stuff first
* Print outputs objects
* See if this skips failure
* Use py instead
* Print error list
* Don't ValueError
* Try checking a different way
* Try something else
* Bleh, spell filename correctly
* Update excluded_dlls.json
* Ugh, gotta compare old to new somehow
* Compare to old list
* Condense build script
* Moar verbosity
* Update the global version
* Update Excluded DLLs list
* Actually use the bad DLLs list
* Make a version number
* Fix version number building
* Fix version number building again
* Fix Diagnostics
* Try REST API stuff
* Try REST API again
* Moar REST
* await
* Get SHA
* Try it all together
* Del test workflow
* Add Perms
* Use a Token
* Try this Token
* Try different Token
* Try different Token
* Create App Version earlier
* See this error again
* Don't fail if App Version not made yet
* Use New Secret
* Print whole response
* Documentation for Tagger
* Update CI Instructions
* Update CI
* List References
* Find latest tag
* Fix App Version getter
* Fix commas
* Check returned data
* Update Build Script
* Fix substring
* Fix Git tag
* Fix tag again
* Visual indicators
* Use encoding
* Remove an indicator
* Update CI
* Update Project Name
* PyInstaller Spec Template file
* Update Build Script
* Fix Tagger
* Update CI
* Download AppVersion during build
* Test job can fail
* Upload Logs instead of printing them
* Change from Reusable Workflow to Action
* Change ref to token
* Compare to string
* Use PAT
* Use String literal
* Remove Reusable Workflow
* Update CI Scripts

---------

Co-authored-by: Minnie A. Trethewey (Mike) <minnietrethewey@gmail.com>
51  .github/actions/appversion-prepare/action.yml  (vendored, new file)
@@ -0,0 +1,51 @@
name: Prepare AppVersion
description: Prepare AppVersion document for later use

runs:
  using: "composite"
  steps:
    # checkout commit
    - name: Checkout commit
      shell: bash
      run: |
        echo "Checkout commit"
    - name: Checkout commit
      uses: actions/checkout@v4.1.4

    # Set Run Number
    - name: Set Run Number
      shell: bash
      run: |
        echo "Set Run Number"
    - name: Set Run Number
      id: set_run_number
      shell: bash
      run: |
        GITHUB_RUN_NUMBER="${{ github.run_number }}a${{ github.run_attempt }}"
        echo "github_run_number=$GITHUB_RUN_NUMBER" >> $GITHUB_OUTPUT

    # Prepare AppVersion
    #TODO: source/classes/appversion.py writes the tag format
    - name: 💬Prepare AppVersion
      shell: bash
      run: |
        echo "💬Prepare AppVersion"
    - name: Prepare AppVersion
      shell: bash
      env:
        OS_NAME: ${{ inputs.os-name }}
        GITHUB_RUN_NUMBER: ${{ steps.set_run_number.outputs.github_run_number }}
      run: |
        python -m source.classes.appversion
        python ./resources/ci/common/prepare_appversion.py

    # upload appversion artifact for later step
    - name: 🔼Upload AppVersion Artifact
      shell: bash
      run: |
        echo "🔼Upload AppVersion Artifact"
    - name: 🔼Upload AppVersion Artifact
      uses: actions/upload-artifact@v4.3.3
      with:
        name: appversion
        path: ./resources/app/meta/manifests/app_version.txt
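The `Set Run Number` step above builds a per-attempt identifier by joining the workflow run number and the attempt counter with an `a`. A minimal sketch of the same composition, using made-up values in place of the `github.run_number` and `github.run_attempt` contexts:

```shell
# Hypothetical stand-ins for the github.run_number / github.run_attempt contexts
RUN_NUMBER=123
RUN_ATTEMPT=2

# Same concatenation as the action: "<run>a<attempt>"
GITHUB_RUN_NUMBER="${RUN_NUMBER}a${RUN_ATTEMPT}"
echo "$GITHUB_RUN_NUMBER"
```

Re-running the same workflow run bumps only the attempt suffix, so each attempt still yields a distinct version string.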
88  .github/actions/build/action.yml  (vendored, new file)
@@ -0,0 +1,88 @@
name: Build
description: Build app
inputs:
  calling-job:
    required: true
    description: Job that's calling this one
  os-name:
    required: true
    description: OS to run on
  python-version:
    required: true
    description: Python version to install

runs:
  using: "composite"
  steps:
    # checkout commit
    - name: Checkout commit
      shell: bash
      run: |
        echo "Checkout commit"
    - name: Checkout commit
      uses: actions/checkout@v4.1.4

    # get parent dir
    - name: Get Parent Directory
      id: parentDir
      uses: ./.github/actions/get-parent-dir

    # try to get UPX
    - name: Get UPX
      shell: bash
      run: |
        echo "Get UPX"
    - name: Get UPX
      shell: bash
      env:
        OS_NAME: ${{ inputs.os-name }}
        UPX_VERSION: "4.2.3"
      run: |
        python ./resources/ci/common/get_upx.py

    # run build.py
    - name: 💬Build Binaries
      shell: bash
      run: |
        echo "💬Build Binaries"
    - name: Build Binaries
      shell: bash
      run: |
        pip install pyinstaller
        python -m source.meta.build

    # upload problem children
    # - name: 🔼Upload Problem Children Artifact
    #   shell: bash
    #   run: |
    #     echo "🔼Upload Problem Children Artifact"
    # - name: 🔼Upload Problem Children Artifact
    #   uses: actions/upload-artifact@v4.3.3
    #   with:
    #     name: problemchildren-${{ inputs.os-name }}-py${{ inputs.python-version }}
    #     path: ./resources/app/meta/manifests/excluded_dlls.json
    #     if-no-files-found: ignore # 'warn' or 'ignore' are also available, defaults to `warn`

    # prepare binary artifact for later step
    - name: 💬Prepare Binary Artifact
      shell: bash
      run: |
        echo "💬Prepare Binary Artifact"
    - name: Prepare Binary Artifact
      shell: bash
      env:
        OS_NAME: ${{ inputs.os-name }}
      run: |
        python ./resources/ci/common/prepare_binary.py

    # upload binary artifact for later step
    - name: 🔼Upload Binary Artifact
      shell: bash
      run: |
        echo "🔼Upload Binary Artifact"
    - name: 🔼Upload Binary Artifact
      uses: actions/upload-artifact@v4.3.3
      with:
        name: binary-${{ inputs.os-name }}-py${{ inputs.python-version }}
        path: ${{ steps.parentDir.outputs.parentDir }}/artifact
        if-no-files-found: error # 'warn' or 'ignore' are also available, defaults to `warn`
41  .github/actions/get-parent-dir/action.yml  (vendored, new file)
@@ -0,0 +1,41 @@
name: 📁Get Parent Directory
description: Get Parent Directory

outputs:
  parentDirNotWin:
    description: "Parent Directory (!Windows)"
    value: ${{ steps.parentDirNotWin.outputs.value }}
  parentDir:
    description: "Parent Directory (Windows)"
    value: ${{ steps.parentDir.outputs.value }}

#########
# actions
#########
# mad9000/actions-find-and-replace-string@5

runs:
  using: "composite"
  steps:
    # get parent directory
    - name: Get Repo Name
      uses: mad9000/actions-find-and-replace-string@5
      id: repoName
      with:
        source: ${{ github.repository }}
        find: "${{ github.repository_owner }}/"
        replace: ""
    - name: 📁Get Parent Directory Path (!Windows)
      uses: mad9000/actions-find-and-replace-string@5
      id: parentDirNotWin
      with:
        source: ${{ github.workspace }}
        find: "${{ steps.repoName.outputs.value }}/${{ steps.repoName.outputs.value }}"
        replace: ${{ steps.repoName.outputs.value }}
    - name: 📁Get Parent Directory Path (Windows)
      uses: mad9000/actions-find-and-replace-string@5
      id: parentDir
      with:
        source: ${{ steps.parentDirNotWin.outputs.value }}
        find: '${{ steps.repoName.outputs.value }}\${{ steps.repoName.outputs.value }}'
        replace: ${{ steps.repoName.outputs.value }}
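The three find-and-replace steps above derive the checkout's parent directory by collapsing the doubled `<repo>/<repo>` suffix of `github.workspace`. The same result can be had with plain shell parameter expansion; a sketch with assumed workspace and repository values:

```shell
# Assumed values; on a real runner these come from github.workspace / github.repository
GITHUB_WORKSPACE="/home/runner/work/ALttPDoorRandomizer/ALttPDoorRandomizer"
GITHUB_REPOSITORY="Owner/ALttPDoorRandomizer"

# Strip the owner prefix to get the bare repo name (the repoName step)
REPO_NAME="${GITHUB_REPOSITORY#*/}"

# Drop the trailing "/<repo>" so "<repo>/<repo>" collapses to "<repo>",
# which is exactly what the find-and-replace steps compute
PARENT_DIR="${GITHUB_WORKSPACE%/$REPO_NAME}"
echo "$PARENT_DIR"
```

The action keeps a separate Windows-path variant only because the doubled segment is joined with `\` there instead of `/`.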
49  .github/actions/install/action.yml  (vendored, new file)
@@ -0,0 +1,49 @@
name: 💿Install
description: Install app
inputs:
  calling-job:
    required: true
    description: Job that's calling this one
  os-name:
    required: true
    description: OS to run on
  python-version:
    required: true
    description: Python version to install

#########
# actions
#########
# actions/checkout@v4.1.4
# actions/setup-python@v5.1.0
# actions/upload-artifact@v4.3.3

runs:
  using: "composite"
  steps:
    # install python
    - name: 💿Install Python
      uses: actions/setup-python@v5.1.0
      with:
        python-version: ${{ inputs.python-version }}
    # install modules via pip
    - name: 💿Install Modules
      shell: bash
      env:
        OS_NAME: ${{ inputs.os-name }}
      run: |
        echo "Install Modules"
        python ./resources/ci/common/get_pipline.py
    # print pipline
    - name: PipLine
      shell: bash
      run: |
        echo "PipLine"
        cat ./resources/user/meta/manifests/pipline.txt
    # upload pipline
    - name: 🔼Upload PipLine
      uses: actions/upload-artifact@v4.3.3
      with:
        name: pipline-${{ inputs.calling-job }}-${{ inputs.os-name }}-py${{ inputs.python-version }}
        path: ./resources/user/meta/manifests
      if: contains(inputs.calling-job, 'test')
77  .github/actions/release-prepare/action.yml  (vendored, new file)
@@ -0,0 +1,77 @@
name: 📀->📦Prepare Release
description: Prepare Release for Deployment
inputs:
  os-name:
    required: true
    description: OS to run on
  python-version:
    required: true
    description: Python version to install

#########
# actions
#########
# Artheau/SpriteSomething/get-parent-dir
# actions/checkout@v4.1.4
# actions/download-artifact@v4.1.7

runs:
  using: "composite"
  steps:
    # checkout commit
    - name: ✔️Checkout commit
      shell: bash
      run: |
        echo "✔️Checkout commit"
    - name: ✔️Checkout commit
      uses: actions/checkout@v4.1.4

    # get parent dir
    - name: 📁Get Parent Directory
      shell: bash
      run: |
        echo "📁Get Parent Directory"
    - name: 📁Get Parent Directory
      id: parentDir
      uses: ./.github/actions/get-parent-dir

    # download binary artifact
    - name: 🔽Download Binary Artifact
      shell: bash
      run: |
        echo "🔽Download Binary Artifact"
    - name: 🔽Download Binary Artifact
      uses: actions/download-artifact@v4.1.7
      with:
        name: binary-${{ inputs.os-name }}-py${{ inputs.python-version }}
        path: ./

    # download appversion artifact
    - name: 🔽Download AppVersion Artifact
      uses: actions/download-artifact@v4.1.7
      with:
        name: appversion
        path: ${{ steps.parentDir.outputs.parentDir }}/build

    # Prepare Release
    - name: 💬Prepare Release
      shell: bash
      run: |
        echo "💬Prepare Release"
    - name: Prepare Release
      shell: bash
      env:
        OS_NAME: ${{ inputs.os-name }}
      run: |
        python ./resources/ci/common/prepare_release.py

    # upload archive artifact for later step
    - name: 🔼Upload Archive Artifact
      shell: bash
      run: |
        echo "🔼Upload Archive Artifact"
    - name: 🔼Upload Archive Artifact
      uses: actions/upload-artifact@v4.3.3
      with:
        name: archive-${{ inputs.os-name }}-py${{ inputs.python-version }}
        path: ${{ steps.parentDir.outputs.parentDir }}/deploy
76  .github/actions/tag-repo/action.yml  (vendored, new file)
@@ -0,0 +1,76 @@
name: 🏷️Tag Repository
description: Tag a repository

inputs:
  repository:
    description: "Repository Owner/Name; octocat/Hello-World"
    required: true
  ref-name:
    description: "Reference name; branch, tag, etc"
    required: true
  github-tag:
    description: "Reference to tag with"
    required: true
  debug:
    description: "Debug Mode, won't set tag"
    required: false
    default: "false"

runs:
  using: "composite"
  steps:
    - name: 🏷️Tag Repository
      uses: actions/github-script@v7.0.1
      with:
        github-token: ${{ env.FINE_PAT }}
        script: |
          // compare the input as a string so the interpolation stays valid JavaScript
          const debug = '${{ inputs.debug }}' == 'true';
          const repository = '${{ inputs.repository }}';
          const owner = repository.substring(0, repository.indexOf('/'));
          const repo = repository.substring(repository.indexOf('/') + 1);
          const ref = '${{ inputs.ref-name }}';
          // get git tag
          const gitTag = '${{ inputs.github-tag }}';
          console.log('Repo Data: ', `${owner}/${repo}@${ref}`)
          console.log('Git tag: ', gitTag)
          if(gitTag == '') {
            let msg = 'Result: 🔴No Git Tag sent, aborting!';
            console.log(msg)
            core.setFailed(msg)
            return
          }
          // get latest commit
          const latestCommit = await github.rest.git.getRef({
            owner: owner,
            repo: repo,
            ref: ref
          })
          // get latest refs
          const latestRefs = await github.rest.git.listMatchingRefs({
            owner: owner,
            repo: repo
          })
          let latestTag = ''; // bucket for latest tag
          // get last tag in data
          for(let thisRef of latestRefs.data) {
            if(thisRef['ref'].indexOf('tags') > -1) {
              let refParts = thisRef['ref'].split('/');
              // refParts[-1] is undefined in JavaScript; index the last element explicitly
              latestTag = refParts[refParts.length - 1];
            }
          }
          console.log('Latest tag:', latestTag)
          if(latestTag != gitTag) {
            if(debug) {
              console.log(`DEBUG: 🔵Creating '${gitTag}' tag`)
            } else {
              console.log(`Result: 🟢Creating '${gitTag}' tag`)
              // await so a failed createRef call surfaces instead of being dropped
              await github.rest.git.createRef({
                owner: owner,
                repo: repo,
                ref: `refs/tags/${gitTag}`,
                sha: latestCommit.data.object.sha
              })
            }
          } else {
            console.log('Result: 🟡Not creating release tag')
          }
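The github-script loop above walks the `listMatchingRefs` data and keeps the final path segment of the last `refs/tags/...` entry as the latest tag. That "last segment of a ref" extraction can be sketched in shell with a sample ref value:

```shell
# Sample ref string as it appears in listMatchingRefs data (assumed value)
REF="refs/tags/v0.9.9"

# Keep only the part after the last '/', i.e. the tag name itself
LATEST_TAG="${REF##*/}"
echo "$LATEST_TAG"
```

Note this relies on `listMatchingRefs` returning tag refs in order, since the loop simply keeps the last one it sees.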
97  .github/actions/test/action.yml  (vendored, new file)
@@ -0,0 +1,97 @@
name: ⏱️Test
description: Test app
inputs:
  os-name:
    required: true
    description: OS to run on
  python-version:
    required: true
    description: Python version to install

#########
# actions
#########
# actions/checkout@v4.1.4
# actions/download-artifact@v4.1.7
# actions/upload-artifact@v4.3.3
# coactions/setup-xvfb@v1.0.1

runs:
  using: "composite"
  steps:
    # download pipline
    - name: 🔽Download PipLine
      shell: bash
      run: |
        echo "🔽Download PipLine"
    - name: 🔽Download PipLine
      uses: actions/download-artifact@v4.1.7
      with:
        name: pipline-test-${{ inputs.os-name }}-py${{ inputs.python-version }}
        path: ./resources/user/meta/manifests

    # run tests
    - name: 🖥️Test Base
      shell: bash
      run: |
        echo "🖥️Test Base"
    - name: 🖥️Test Base
      shell: bash
      run: |
        mkdir -p ./failures
        echo "" > ./failures/errors.txt
        python -m pip install tqdm
        python ./test/NewTestSuite.py
    # - name: 🖥️Test Mystery
    #   shell: bash
    #   run: |
    #     echo "🖥️Test Mystery"
    #   if: contains(inputs.os-name, 'macos')
    # - name: 🖥️Test Mystery
    #   shell: bash
    #   run: |
    #     python ./test/MysteryTestSuite.py
    #   if: contains(inputs.os-name, 'macos')

    # upload logs
    - name: 🔼Upload Logs
      shell: bash
      run: |
        echo "🔼Upload Logs"
    - name: 🔼Upload Logs
      uses: actions/upload-artifact@v4.3.3
      with:
        name: logs-${{ inputs.os-name }}-py${{ inputs.python-version }}
        path: ./logs
        if-no-files-found: ignore

    # print failures
    - name: 💬Print Failures
      if: failure()
      shell: bash
      run: |
        echo "💬Print Failures"
    - name: Print Failures
      if: failure()
      shell: bash
      run: |
        ERR_STRING="$(cat ./failures/errors.txt)"
        ERR_STRING="${ERR_STRING//'%'/'%25'}"
        ERR_STRING="${ERR_STRING//$'\n'/' | '}"
        ERR_STRING="${ERR_STRING//$'\r'/' | '}"
        ERR_STRING="${ERR_STRING//$'\n'/'%0A'}"
        ERR_STRING="${ERR_STRING//$'\r'/'%0D'}"
        echo "::error ::$ERR_STRING"

    # upload failures
    - name: 🔼Upload Failures
      if: failure()
      shell: bash
      run: |
        echo "🔼Upload Failures"
    - name: 🔼Upload Failures
      if: failure()
      uses: actions/upload-artifact@v4.3.3
      with:
        name: failures-${{ inputs.os-name }}-py${{ inputs.python-version }}
        path: ./failures
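The `Print Failures` step folds a multi-line error file into a single `::error ::` workflow command; a literal `%` must be encoded before any `%0A`/`%0D` sequences so the escapes stay unambiguous. A sketch of the same bash parameter expansions on an in-memory string instead of `./failures/errors.txt`:

```shell
# Sample multi-line error text (stand-in for the errors.txt contents)
ERR_STRING='100% failed
second line'

# Encode literal '%' first, then fold newlines into a visual separator,
# mirroring the substitutions in the action
ERR_STRING="${ERR_STRING//'%'/'%25'}"
ERR_STRING="${ERR_STRING//$'\n'/' | '}"
echo "$ERR_STRING"
```

The result is one line, safe to embed in a `::error ::` annotation without the runner truncating it at the first newline.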
316  .github/workflows/ci.yml  (vendored, deleted)
@@ -1,316 +0,0 @@
# workflow name
name: Build

# fire on
on:
  push:
    branches:
      - DoorDev
  pull_request:
    branches:
      - DoorDev

# stuff to do
jobs:
  # Install & Build
  # Set up environment
  # Build
  # Run build-gui.py
  # Run build-dr.py
  install-build:
    name: Install/Build
    # cycle through os list
    runs-on: ${{ matrix.os-name }}

    # VM settings
    # os & python versions
    strategy:
      matrix:
        os-name: [ ubuntu-latest, ubuntu-20.04, macOS-latest, windows-latest ]
        python-version: [ 3.9 ]
    # needs: [ install-test ]
    steps:
      # checkout commit
      - name: Checkout commit
        uses: actions/checkout@v3
      # install python
      - name: Install python
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
          architecture: "x64"
      - run: |
          python --version
      # install dependencies via pip
      - name: Install dependencies via pip
        env:
          OS_NAME: ${{ matrix.os-name }}
        run: |
          python ./resources/ci/common/install.py
          pip install pyinstaller
      # get parent directory
      - name: Get Repo Name
        uses: mad9000/actions-find-and-replace-string@3
        id: repoName
        with:
          source: ${{ github.repository }}
          find: '${{ github.repository_owner }}/'
          replace: ''
      - name: Get Parent Directory Path (!Windows)
        uses: mad9000/actions-find-and-replace-string@3
        id: parentDirNotWin
        with:
          source: ${{ github.workspace }}
          find: '${{ steps.repoName.outputs.value }}/${{ steps.repoName.outputs.value }}'
          replace: ${{ steps.repoName.outputs.value }}
      - name: Get Parent Directory Path (Windows)
        uses: mad9000/actions-find-and-replace-string@3
        id: parentDir
        with:
          source: ${{ steps.parentDirNotWin.outputs.value }}
          find: '${{ steps.repoName.outputs.value }}\${{ steps.repoName.outputs.value }}'
          replace: ${{ steps.repoName.outputs.value }}
      # try to get UPX
      - name: Get UPX
        env:
          OS_NAME: ${{ matrix.os-name }}
        run: |
          python ./resources/ci/common/get_upx.py
      # run build-gui.py
      - name: Build GUI
        run: |
          python ./source/meta/build-gui.py
      # run build-dr.py
      - name: Build DungeonRandomizer
        run: |
          python ./source/meta/build-dr.py
      # prepare binary artifacts for later step
      - name: Prepare Binary Artifacts
        env:
          OS_NAME: ${{ matrix.os-name }}
        run: |
          python ./resources/ci/common/prepare_binary.py
      # upload binary artifacts for later step
      - name: Upload Binary Artifacts
        uses: actions/upload-artifact@v3
        with:
          name: binaries-${{ matrix.os-name }}
          path: ${{ steps.parentDir.outputs.value }}/artifact

  # Install & Preparing Release
  # Set up environment
  # Local Prepare Release action
  install-prepare-release:
    name: Install/Prepare Release
    # cycle through os list
    runs-on: ${{ matrix.os-name }}

    # VM settings
    # os & python versions
    strategy:
      matrix:
        # install/release on not bionic
        os-name: [ ubuntu-latest, ubuntu-20.04, macOS-latest, windows-latest ]
        python-version: [ 3.9 ]

    needs: [ install-build ]
    steps:
      # checkout commit
      - name: Checkout commit
        uses: actions/checkout@v3
      # install python
      - name: Install Python
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
          architecture: "x64"
      - run: |
          python --version
      # install dependencies via pip
      - name: Install Dependencies via pip
        env:
          OS_NAME: ${{ matrix.os-name }}
        run: |
          python ./resources/ci/common/install.py
      # get parent directory
      - name: Get Repo Name
        uses: mad9000/actions-find-and-replace-string@3
        id: repoName
        with:
          source: ${{ github.repository }}
          find: '${{ github.repository_owner }}/'
          replace: ''
      - name: Get Parent Directory Path (!Windows)
        uses: mad9000/actions-find-and-replace-string@3
        id: parentDirNotWin
        with:
          source: ${{ github.workspace }}
          find: '${{ steps.repoName.outputs.value }}/${{ steps.repoName.outputs.value }}'
          replace: ${{ steps.repoName.outputs.value }}
      - name: Get Parent Directory Path (Windows)
        uses: mad9000/actions-find-and-replace-string@3
        id: parentDir
        with:
          source: ${{ steps.parentDirNotWin.outputs.value }}
          find: '${{ steps.repoName.outputs.value }}\${{ steps.repoName.outputs.value }}'
          replace: ${{ steps.repoName.outputs.value }}
      # download binary artifact
      - name: Download Binary Artifact
        uses: actions/download-artifact@v3
        with:
          name: binaries-${{ matrix.os-name }}
          path: ./
      # Prepare AppVersion & Release
      - name: Prepare AppVersion & Release
        env:
          OS_NAME: ${{ matrix.os-name }}
        run: |
          python ./build-app_version.py
          python ./resources/ci/common/prepare_appversion.py
          python ./resources/ci/common/prepare_release.py
      # upload appversion artifact for later step
      - name: Upload AppVersion Artifact
        uses: actions/upload-artifact@v3
        with:
          name: appversion-${{ matrix.os-name }}
          path: ./resources/app/meta/manifests/app_version.txt
      # upload archive artifact for later step
      - name: Upload Archive Artifact
        uses: actions/upload-artifact@v3
        with:
          name: archive-${{ matrix.os-name }}
          path: ${{ steps.parentDir.outputs.value }}/deploy

  # Deploy to GitHub Releases
  # Release Name: ALttPDoorRandomizer v${GITHUB_TAG}
  # Release Body: Inline content of RELEASENOTES.md
  # Release Body: Fallback to URL to RELEASENOTES.md
  # Release Files: ${{ steps.parentDir.outputs.value }}/deploy
  deploy-release:
    name: Deploy GHReleases
    runs-on: ${{ matrix.os-name }}

    # VM settings
    # os & python versions
    strategy:
      matrix:
        # release only on focal
        os-name: [ ubuntu-latest ]
        python-version: [ 3.9 ]

    needs: [ install-prepare-release ]
    steps:
      # checkout commit
      - name: Checkout commit
        uses: actions/checkout@v3
      # get parent directory
      - name: Get Repo Name
        uses: mad9000/actions-find-and-replace-string@3
        id: repoName
        with:
          source: ${{ github.repository }}
          find: '${{ github.repository_owner }}/'
          replace: ''
      - name: Get Parent Directory Path (!Windows)
        uses: mad9000/actions-find-and-replace-string@3
        id: parentDirNotWin
        with:
          source: ${{ github.workspace }}
          find: '${{ steps.repoName.outputs.value }}/${{ steps.repoName.outputs.value }}'
          replace: ${{ steps.repoName.outputs.value }}
      - name: Get Parent Directory Path (Windows)
        uses: mad9000/actions-find-and-replace-string@3
        id: parentDir
        with:
          source: ${{ steps.parentDirNotWin.outputs.value }}
          find: '${{ steps.repoName.outputs.value }}\${{ steps.repoName.outputs.value }}'
          replace: ${{ steps.repoName.outputs.value }}
      - name: Install Dependencies via pip
        run: |
          python -m pip install pytz requests
      # download appversion artifact
      - name: Download AppVersion Artifact
        uses: actions/download-artifact@v3
        with:
          name: appversion-${{ matrix.os-name }}
          path: ${{ steps.parentDir.outputs.value }}/build
      # download ubuntu archive artifact
      - name: Download Ubuntu Archive Artifact
        uses: actions/download-artifact@v3
        with:
          name: archive-ubuntu-latest
          path: ${{ steps.parentDir.outputs.value }}/deploy/linux
      # download macos archive artifact
      - name: Download MacOS Archive Artifact
        uses: actions/download-artifact@v3
        with:
          name: archive-macOS-latest
          path: ${{ steps.parentDir.outputs.value }}/deploy/macos
      # download windows archive artifact
      - name: Download Windows Archive Artifact
        uses: actions/download-artifact@v3
        with:
          name: archive-windows-latest
          path: ${{ steps.parentDir.outputs.value }}/deploy/windows
      # debug info
      - name: Debug Info
        id: debug_info
        # shell: bash
        # git tag ${GITHUB_TAG}
        # git push origin ${GITHUB_TAG}
        run: |
          GITHUB_TAG="$(head -n 1 ../build/app_version.txt)"
          echo "::set-output name=github_tag::$GITHUB_TAG"
          GITHUB_TAG="v${GITHUB_TAG}"
          RELEASE_NAME="ALttPDoorRandomizer ${GITHUB_TAG}"
          echo "Release Name: ${RELEASE_NAME}"
          echo "Git Tag: ${GITHUB_TAG}"
      # create a pre/release
      - name: Create a Pre/Release
        id: create_release
        uses: actions/create-release@v1.1.4
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          tag_name: v${{ steps.debug_info.outputs.github_tag }}
          release_name: ALttPDoorRandomizer v${{ steps.debug_info.outputs.github_tag }}
          body_path: RELEASENOTES.md
          draft: true
          prerelease: true
        if: contains(github.ref, 'master') || contains(github.ref, 'stable') || contains(github.ref, 'dev') || contains(github.ref, 'DoorRelease')
      # upload linux archive asset
      - name: Upload Linux Archive Asset
        id: upload-linux-asset
        uses: actions/upload-release-asset@v1.0.2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          upload_url: ${{ steps.create_release.outputs.upload_url }}
          asset_path: ../deploy/linux/ALttPDoorRandomizer.tar.gz
          asset_name: ALttPDoorRandomizer-${{ steps.debug_info.outputs.github_tag }}-linux-focal.tar.gz
          asset_content_type: application/gzip
        if: contains(github.ref, 'master') || contains(github.ref, 'stable') || contains(github.ref, 'dev') || contains(github.ref, 'DoorRelease')
      # upload macos archive asset
      - name: Upload MacOS Archive Asset
        id: upload-macos-asset
        uses: actions/upload-release-asset@v1.0.2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          upload_url: ${{ steps.create_release.outputs.upload_url }}
          asset_path: ../deploy/macos/ALttPDoorRandomizer.tar.gz
          asset_name: ALttPDoorRandomizer-${{ steps.debug_info.outputs.github_tag }}-osx.tar.gz
          asset_content_type: application/gzip
        if: contains(github.ref, 'master') || contains(github.ref, 'stable') || contains(github.ref, 'dev') || contains(github.ref, 'DoorRelease')
      # upload windows archive asset
      - name: Upload Windows Archive Asset
        id: upload-windows-asset
        uses: actions/upload-release-asset@v1.0.2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          upload_url: ${{ steps.create_release.outputs.upload_url }}
          asset_path: ../deploy/windows/ALttPDoorRandomizer.zip
          asset_name: ALttPDoorRandomizer-${{ steps.debug_info.outputs.github_tag }}-windows.zip
          asset_content_type: application/zip
        if: contains(github.ref, 'master') || contains(github.ref, 'stable') || contains(github.ref, 'dev') || contains(github.ref, 'DoorRelease')
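The deleted workflow's `Debug Info` step still used the deprecated `::set-output` workflow command; on current runners the equivalent is appending `name=value` lines to the file named by the `GITHUB_OUTPUT` environment variable, which is the pattern the new `appversion-prepare` action already uses. A sketch of that replacement, with a temp file standing in for the runner-provided one:

```shell
# Stand-in for the runner-provided $GITHUB_OUTPUT file
GITHUB_OUTPUT="$(mktemp)"

GITHUB_TAG="1.2.3"
# Modern replacement for: echo "::set-output name=github_tag::$GITHUB_TAG"
echo "github_tag=$GITHUB_TAG" >> "$GITHUB_OUTPUT"

cat "$GITHUB_OUTPUT"
```

Later steps then read the value as `steps.<id>.outputs.github_tag`, exactly as before.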
47  .github/workflows/release-complete.yml  (vendored, new file)
@@ -0,0 +1,47 @@
|
||||
# workflow name
|
||||
name: 🏷️Tag Repositories
|
||||
|
||||
# Fine-grained personal access token
|
||||
# https://github.com/settings/tokens?type=beta
|
||||
# token needs perms:
|
||||
# actions: read/write
|
||||
# commit statuses: read/write
|
||||
# contents: read/write
|
||||
# workflows: read/write
|
||||
# copy token
|
||||
# Actions secrets and variables
|
||||
# github.com/<owner>/<repo>/settings/secrets/actions
|
||||
# repository secret
|
||||
# name a new secret "ALTTPER_TAGGER"
|
||||
# value set to copied token
|
||||
|
||||
# fire on
|
||||
on:
|
||||
release:
|
||||
types:
|
||||
- released
|
||||
|
||||
jobs:
|
||||
# Tag Baserom
|
||||
tag-baserom:
|
||||
name: 🖳Tag Baserom
|
||||
runs-on: ${{ matrix.os-name }}
|
||||
strategy:
|
||||
matrix:
|
||||
os-name: [
|
||||
# ubuntu-latest
|
||||
"ubuntu-22.04"
|
||||
]
|
||||
|
||||
steps:
|
||||
# call checkout
|
||||
- name: ✔️Checkout commit
|
||||
uses: actions/checkout@v4.1.4
|
||||
- name: 🏷️Tag Repository
|
||||
uses: ./.github/actions/tag-repo
|
||||
env:
|
||||
FINE_PAT: ${{ secrets.ALTTPER_TAGGER }}
|
||||
with:
|
||||
repository: ${{ github.repository_owner }}/z3randomizer
|
||||
ref-name: heads/OWMain
|
||||
github-tag: ${{ github.event.release.tag_name }}
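Under the hood, tagging a sibling repository like this comes down to GitHub's "create a reference" REST endpoint (`POST /repos/{owner}/{repo}/git/refs`). A minimal sketch of how such a request could be composed in Python — the repository, tag, and SHA values here are illustrative, not taken from the action's actual implementation:

```python
# Build the (url, payload) pair for GitHub's create-a-reference endpoint.
# The repo/tag/sha values below are hypothetical examples.
def build_tag_request(repository: str, tag: str, sha: str) -> tuple:
    """Return the URL and JSON payload for POST /repos/{repository}/git/refs."""
    url = f"https://api.github.com/repos/{repository}/git/refs"
    payload = {"ref": f"refs/tags/{tag}", "sha": sha}
    return url, payload

url, payload = build_tag_request("owner/z3randomizer", "v1.2.3", "0" * 40)
print(url)
print(payload["ref"])
```

The actual call would send this payload with an `Authorization: Bearer <token>` header carrying the fine-grained PAT stored in `ALTTPER_TAGGER`.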
418 .github/workflows/release-create.yml vendored Normal file
@@ -0,0 +1,418 @@
# workflow name
name: ⏱️Test/🔨Build/🚀Deploy

# fire on
on: [
  push,
  pull_request
]

# on:
#   push:
#     branches:
#       - DoorDevUnstable
#       - DoorDev
#       - OverworldShuffleDev
#       - OverworldShuffle
#   pull_request:
#     branches:
#       - DoorDevUnstable
#       - DoorDev
#       - OverworldShuffleDev
#       - OverworldShuffle

# stuff to do
jobs:
  # Diagnostics
  diags:
    # diagnostics
    # call checkout
    # call install python
    # print python version
    # call install
    # call analyze github actions
    # install extra python modules
    # run diagnostics
    name: 🧮
    runs-on: ${{ matrix.os-name }}
    continue-on-error: True

    strategy:
      matrix:
        #TODO: OS List to run on
        os-name: [
          # ubuntu-latest, # ubuntu-22.04
          ubuntu-22.04,
          ubuntu-20.04,
          macos-latest, # macos-12
          windows-latest # windows-2022
        ]
        #TODO: Python Version to run on
        python-version: [ "3.12" ]
    steps:
      # call checkout
      - name: ✔️Checkout commit
        uses: actions/checkout@v4.1.4
      # call install python
      - name: 💿Install Python
        uses: actions/setup-python@v5.1.0
        with:
          python-version: ${{ matrix.python-version }}
      # print python version
      - name: 🐍Python Version
        shell: bash
        run: |
          python --version
      # call install
      - name: 💿Call Install
        uses: ./.github/actions/install
        with:
          calling-job: diags
          os-name: ${{ matrix.os-name }}
          python-version: ${{ matrix.python-version }}
      # call analyze github actions
      - name: ⚙️Analyze used GitHub Actions
        shell: bash
        run: |
          python ./resources/ci/common/list_actions.py
      # install extra python modules
      - name: 💿Install extra Python Modules
        shell: bash
        run: |
          python -m pip install setuptools
      # run diagnostics
      - name: 🧮Print Diagnostics
        shell: bash
        run: |
          python -m source.meta.run_diags

  # Test
  install-test:
    # test
    # call checkout
    # call install
    # run tests
    name: 💿/⏱️
    runs-on: ${{ matrix.os-name }}
    continue-on-error: False

    strategy:
      matrix:
        #TODO: OS List to run on
        os-name: [
          # ubuntu-latest, # ubuntu-22.04
          ubuntu-22.04,
          ubuntu-20.04,
          macos-latest, # macos-12
          windows-latest # windows-2022
        ]
        #TODO: Python Version to run on
        python-version: [ "3.12" ]
    steps:
      # call checkout
      - name: ✔️Checkout commit
        uses: actions/checkout@v4.1.4
      # call install
      - name: 💿Call Install
        uses: ./.github/actions/install
        with:
          calling-job: test
          os-name: ${{ matrix.os-name }}
          python-version: ${{ matrix.python-version }}
      # call test
      - name: ⏱️Call Test
        uses: ./.github/actions/test
        with:
          os-name: ${{ matrix.os-name }}
          python-version: ${{ matrix.python-version }}

  # Prepare AppVersion
  appversion-prepare:
    # prepare appversion
    # call checkout
    # call install
    # call appversion-prepare
    name: 💬
    runs-on: ${{ matrix.os-name }}
    needs: [install-test]
    continue-on-error: False

    strategy:
      matrix:
        #TODO: OS List to run on
        os-name: [
          # ubuntu-latest, # ubuntu-22.04
          ubuntu-22.04,
        ]
        #TODO: Python Version to run on
        python-version: [ "3.12" ]
    steps:
      # call checkout
      - name: ✔️Checkout commit
        uses: actions/checkout@v4.1.4
      # call install
      - name: 💿Call Install
        uses: ./.github/actions/install
        with:
          calling-job: appversion-prepare
          os-name: ${{ matrix.os-name }}
          python-version: ${{ matrix.python-version }}
      # call appversion-prepare
      - name: 💬Call Prepare AppVersion
        uses: ./.github/actions/appversion-prepare

  # Build
  install-build:
    # build
    # call checkout
    # call install
    # call build
    name: 💿/🔨
    runs-on: ${{ matrix.os-name }}
    needs: [appversion-prepare]
    continue-on-error: False

    strategy:
      matrix:
        #TODO: OS List to run on
        os-name: [
          # ubuntu-latest, # ubuntu-22.04
          ubuntu-22.04,
          ubuntu-20.04,
          macos-latest, # macos-12
          windows-latest # windows-2022
        ]
        #TODO: Python Version to run on
        python-version: [ "3.12" ]
    steps:
      # call checkout
      - name: ✔️Checkout commit
        uses: actions/checkout@v4.1.4
      # call install
      - name: 💿Call Install
        uses: ./.github/actions/install
        with:
          calling-job: build
          os-name: ${{ matrix.os-name }}
          python-version: ${{ matrix.python-version }}
      # call build
      - name: 🔨Call Build
        uses: ./.github/actions/build
        with:
          calling-job: build
          os-name: ${{ matrix.os-name }}
          python-version: ${{ matrix.python-version }}

  # Prepare Release
  release-prepare:
    # prepare release
    # call checkout
    # install extra python modules
    # call prepare release
    name: 💿/📀->📦
    runs-on: ${{ matrix.os-name }}
    needs: [install-build]
    continue-on-error: False

    strategy:
      matrix:
        #TODO: OS List to run on
        os-name: [
          # ubuntu-latest, # ubuntu-22.04
          ubuntu-22.04,
          ubuntu-20.04,
          macos-latest, # macos-12
          windows-latest # windows-2022
        ]
        python-version: [ "3.12" ]
    steps:
      # call checkout
      - name: ✔️Checkout commit
        uses: actions/checkout@v4.1.4
      # install extra python modules
      - name: 💿Install extra Python Modules
        shell: bash
        run: |
          python -m pip install setuptools
      # call prepare release
      - name: 📀->📦Prepare Release
        uses: ./.github/actions/release-prepare
        with:
          os-name: ${{ matrix.os-name }}
          python-version: ${{ matrix.python-version }}

  # Deploy Release
  # Needs to be top-level for SECRET to work easily
  release-deploy:
    name: 📀->🚀
    runs-on: ${{ matrix.os-name }}
    needs: [release-prepare]

    strategy:
      matrix:
        #TODO: OS List to run on
        os-name: [
          # ubuntu-latest, # ubuntu-22.04
          ubuntu-22.04,
        ]
        #TODO: Python Version to run on
        python-version: [ "3.12" ]

    steps:
      # checkout commit
      - name: ✔️Checkout commit
        uses: actions/checkout@v4.1.4

      # install extra python modules
      - name: 💿Install extra Python Modules
        shell: bash
        run: |
          python -m pip install pytz requests

      # get parent dir
      - name: 📁Get Parent Directory
        id: parentDir
        uses: ./.github/actions/get-parent-dir

      # download appversion artifact
      - name: 🔽Download AppVersion Artifact
        uses: actions/download-artifact@v4.1.7
        with:
          name: appversion
          path: ${{ steps.parentDir.outputs.parentDir }}/build

      # download ubuntu archive artifact
      - name: 🔽Download Ubuntu Archive Artifact
        uses: actions/download-artifact@v4.1.7
        with:
          # should run on latest explicit ubuntu version
          name: archive-ubuntu-22.04-py${{ matrix.python-version }}
          path: ${{ steps.parentDir.outputs.parentDir }}/deploy/linux

      # download macos archive artifact
      - name: 🔽Download MacOS Archive Artifact
        uses: actions/download-artifact@v4.1.7
        with:
          name: archive-macos-latest-py${{ matrix.python-version }}
          path: ${{ steps.parentDir.outputs.parentDir }}/deploy/macos

      # download windows archive artifact
      - name: 🔽Download Windows Archive Artifact
        uses: actions/download-artifact@v4.1.7
        with:
          name: archive-windows-latest-py${{ matrix.python-version }}
          path: ${{ steps.parentDir.outputs.parentDir }}/deploy/windows

      # determine linux archive asset
      - name: ❔Identify Linux Archive Asset
        id: identify-linux-asset
        shell: bash
        run: |
          ASSET_LINUX="$(ls ${{ steps.parentDir.outputs.parentDir }}/deploy/linux)"
          echo "asset_linux=$ASSET_LINUX" >> $GITHUB_OUTPUT

      # determine macos archive asset
      - name: ❔Identify MacOS Archive Asset
        id: identify-macos-asset
        shell: bash
        run: |
          ASSET_MACOS="$(ls ${{ steps.parentDir.outputs.parentDir }}/deploy/macos)"
          echo "asset_macos=$ASSET_MACOS" >> $GITHUB_OUTPUT

      # determine windows archive asset
      - name: ❔Identify Windows Archive Asset
        id: identify-windows-asset
        shell: bash
        run: |
          ASSET_WIN="$(ls ${{ steps.parentDir.outputs.parentDir }}/deploy/windows)"
          echo "asset_windows=$ASSET_WIN" >> $GITHUB_OUTPUT

      # archive listing
      # - name: Archive Listing
      #   shell: bash
      #   run: |
      #     ls -R ${{ steps.parentDir.outputs.parentDir }}/deploy/

      # debug info
      #TODO: Project Name
      - name: 📝Debug Info
        id: debug_info
        run: |
          PROJECT_NAME="ALttPOverworldRandomizer"
          echo "project_name=$PROJECT_NAME" >> $GITHUB_OUTPUT

          GITHUB_TAG="$(head -n 1 ../build/app_version.txt)"
          echo "github_tag=$GITHUB_TAG" >> $GITHUB_OUTPUT

          RELEASE_NAME="${PROJECT_NAME} ${GITHUB_TAG}"
          echo "release_name=$RELEASE_NAME" >> $GITHUB_OUTPUT

          ASSET_PREFIX="${PROJECT_NAME}-${GITHUB_TAG}"
          echo "asset_prefix=$ASSET_PREFIX" >> $GITHUB_OUTPUT

          echo "Project Name: ${PROJECT_NAME}"
          echo "Release Name: ${RELEASE_NAME}"
          echo "Asset Prefix: ${ASSET_PREFIX}"
          echo "Git Tag: ${GITHUB_TAG}"
          echo "Linux Asset: ${{ steps.identify-linux-asset.outputs.asset_linux }}"
          echo "MacOS Asset: ${{ steps.identify-macos-asset.outputs.asset_macos }}"
          echo "Windows Asset: ${{ steps.identify-windows-asset.outputs.asset_windows }}"

      # create a release (MASTER)
      #TODO: Make sure we updated RELEASENOTES.md
      #TODO: Make sure we're firing on the proper branches
      # if: contains(github.ref, 'master') # branch or tag name
      # if: contains(github.event.head_commit.message, 'Version bump') # commit message
      - name: 📀->🚀Create a Release (MASTER)
        id: create_release
        uses: actions/create-release@v1.1.4
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          tag_name: ${{ steps.debug_info.outputs.github_tag }}
          release_name: ${{ steps.debug_info.outputs.release_name }}
          body_path: RELEASENOTES.md
          # draft: true
        if: contains(github.ref, 'master')

      # upload linux archive asset (MASTER)
      #TODO: Make sure we're firing on the proper branches
      - name: 🔼Upload Linux Archive Asset (MASTER)
        id: upload-linux-asset
        uses: actions/upload-release-asset@v1.0.2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          upload_url: ${{ steps.create_release.outputs.upload_url }}
          asset_path: ${{ steps.parentDir.outputs.parentDir }}/deploy/linux/${{ steps.identify-linux-asset.outputs.asset_linux }}
          asset_name: ${{ steps.debug_info.outputs.asset_prefix }}-linux-focal.tar.gz
          asset_content_type: application/gzip
        if: contains(github.ref, 'master')

      # upload macos archive asset (MASTER)
      #TODO: Make sure we're firing on the proper branches
      - name: 🔼Upload MacOS Archive Asset (MASTER)
        id: upload-macos-asset
        uses: actions/upload-release-asset@v1.0.2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          upload_url: ${{ steps.create_release.outputs.upload_url }}
          asset_path: ${{ steps.parentDir.outputs.parentDir }}/deploy/macos/${{ steps.identify-macos-asset.outputs.asset_macos }}
          asset_name: ${{ steps.debug_info.outputs.asset_prefix }}-osx.tar.gz
          asset_content_type: application/gzip
        if: contains(github.ref, 'master')

      # upload windows archive asset (MASTER)
      #TODO: Make sure we're firing on the proper branches
      - name: 🔼Upload Windows Archive Asset (MASTER)
        id: upload-windows-asset
        uses: actions/upload-release-asset@v1.0.2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          upload_url: ${{ steps.create_release.outputs.upload_url }}
          asset_path: ${{ steps.parentDir.outputs.parentDir }}/deploy/windows/${{ steps.identify-windows-asset.outputs.asset_windows }}
          asset_name: ${{ steps.debug_info.outputs.asset_prefix }}-windows.zip
          asset_content_type: application/zip
        if: contains(github.ref, 'master')
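The bash `debug_info` step above derives all of the release and asset names from just the project name and the tag read out of `app_version.txt`. A small Python sketch of the same naming scheme (the helper name is ours, not part of the workflow):

```python
# Mirror of the naming scheme built in the debug_info step: one release name
# plus one archive name per platform, all derived from project name and tag.
def release_names(project: str, tag: str) -> dict:
    prefix = f"{project}-{tag}"
    return {
        "release_name": f"{project} {tag}",
        "linux": f"{prefix}-linux-focal.tar.gz",
        "macos": f"{prefix}-osx.tar.gz",
        "windows": f"{prefix}-windows.zip",
    }

names = release_names("ALttPOverworldRandomizer", "3.1.0")
print(names["release_name"])
print(names["windows"])
```

Keeping the prefix in one place is what lets the three upload steps stay identical apart from the platform suffix and content type.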
7 .gitignore vendored
@@ -13,7 +13,7 @@
*.bst
*.wixobj
*.bat
build
/build
bundle/components.wxs
dist
README.html
@@ -40,7 +40,10 @@ resources/user/*
get-pip.py

venv
test
/test
test_games/
data/sprites/official/selan.1.zspr
*.zspr

*errors.txt
*success.txt

@@ -23,7 +23,7 @@ def start():
    # print diagnostics
    # usage: py DungeonRandomizer.py --diags
    if args.diags:
        diags = diagnostics.output(__version__)
        diags = diagnostics.output()
        print("\n".join(diags))
        sys.exit(0)

14 Text.py
@@ -2,6 +2,8 @@
from collections import OrderedDict
import logging
import re
import warnings
warnings.filterwarnings("ignore", category=SyntaxWarning)

text_addresses = {'Pedestal': (0x180300, 256),
                  'Triforce': (0x180400, 256),
@@ -106,18 +108,18 @@ Triforce_texts = [
    " Whelp…\n that just\n happened",
    " Oh hey…\n it's you",
    "\n Wheeeeee!!",
    " Time for\n another one?",
    " Time for\n another one?",
    " And\n\n scene",
    "\n GOT EM!!",
    "\n THE VALUUUE!!!",
    " Cool seed,\n\n right?",
    "\n We did it!",
    " Spam those\n emotes in\n wilds chat",
    "\n O M G",
    "\n O M G",
    " Hello. Will you\n you be my friend?",
    " Beetorp\n was\n here!",
    " The Wind Fish\n will wake soon.\n Hoot!",
    " Meow Meow Meow\n Meow Meow Meow\n Oh My God!",
    " Meow Meow Meow\n Meow Meow Meow\n Oh my god!",
    " Ahhhhhhhhh\n Ya ya yaaaah\n Ya ya yaaah",
    " .done\n\n .comment lol",
    " You get to\n drink from\n the firehose",
@@ -645,7 +647,7 @@ class MultiByteCoreTextMapper(object):
                linespace = wrap
            line = lines.pop(0)

            match = re.search('^\{[A-Z0-9_:]+\}$', line)
            match = re.search(r'^\{[A-Z0-9_:]+\}$', line)
            if match:
                if line == '{PAGEBREAK}':
                    if lineindex % 3 != 0:
@@ -664,13 +666,13 @@ class MultiByteCoreTextMapper(object):
                while words:
                    word = words.pop(0)

                    match = re.search('^(\{[A-Z0-9_:]+\}).*', word)
                    match = re.search(r'^(\{[A-Z0-9_:]+\}).*', word)
                    if match:
                        start_command = match.group(1)
                        outbuf.extend(cls.special_commands[start_command])
                        word = word.replace(start_command, '')

                    match = re.search('(\{[A-Z0-9_:]+\})\.?$', word)
                    match = re.search(r'(\{[A-Z0-9_:]+\})\.?$', word)
                    if match:
                        end_command = match.group(1)
                        word = word.replace(end_command, '')

@@ -98,8 +98,8 @@ compass_shuffle:
  on: 1
  off: 1
smallkey_shuffle:
  on: 1
  off: 1
  wild: 1
  none: 1
bigkey_shuffle:
  on: 1
  off: 1

0 resources/app/meta/manifests/app_version.txt Normal file
7 resources/app/meta/manifests/binaries.json Normal file
@@ -0,0 +1,7 @@
[
  "DungeonRandomizer",
  "Gui",
  "MultiClient",
  "MultiServer",
  "Mystery"
]
34 resources/app/meta/manifests/excluded_dlls.json Normal file
@@ -0,0 +1,34 @@
[
  "conio",
  "console",
  "convert",
  "datetime",
  "debug",
  "environment",
  "errorhandling",
  "file",
  "filesystem",
  "handle",
  "heap",
  "interlocked",
  "libraryloader",
  "locale",
  "localization",
  "math",
  "memory",
  "namedpipe",
  "process",
  "processenvironment",
  "processthreads",
  "profile",
  "rtlsupport",
  "runtime",
  "stdio",
  "string",
  "synch",
  "sysinfo",
  "time",
  "timezone",
  "util",
  "utility"
]
@@ -1,7 +1,8 @@
aenum
aioconsole
colorama
distro
fast-enum
python-bps-continued
colorama
aioconsole
pyyaml
websockets
pyyaml
@@ -1,6 +1,11 @@
import os  # for env vars
import stat  # file statistics
import sys  # default system info
try:
    import distro
except ModuleNotFoundError as e:
    pass

from my_path import get_py_path

global UBUNTU_VERSIONS
@@ -8,15 +13,20 @@ global DEFAULT_EVENT
global DEFAULT_REPO_SLUG
global FILENAME_CHECKS
global FILESIZE_CHECK
UBUNTU_VERSIONS = {
    "latest": "focal",
    "20.04": "focal",
    "18.04": "bionic",
    "16.04": "xenial"
}
# GitHub Hosted Runners
# https://docs.github.com/en/actions/using-github-hosted-runners/about-github-hosted-runners/about-github-hosted-runners#standard-github-hosted-runners-for-public-repositories
# ubuntu: 22.04, 20.04
# windows: 2022, 2019
# macos: 14, 13, 12, 11
DEFAULT_EVENT = "event"
DEFAULT_REPO_SLUG = "miketrethewey/ALttPDoorRandomizer"
FILENAME_CHECKS = [ "Gui", "DungeonRandomizer" ]
FILENAME_CHECKS = [
    "DungeonRandomizer",
    "Gui",
    "MultiClient",
    "MultiServer",
    "Mystery"
]
FILESIZE_CHECK = (6 * 1024 * 1024)  # 6MB

# take number of bytes and convert to string with units measure
@@ -38,12 +48,19 @@ def prepare_env():
    global DEFAULT_REPO_SLUG
    env = {}

    # get app version
    APP_VERSION = ""
    APP_VERSION_FILE = os.path.join(".","resources","app","meta","manifests","app_version.txt")
    if os.path.isfile(APP_VERSION_FILE):
        with open(APP_VERSION_FILE,"r") as f:
            APP_VERSION = f.readlines()[0].strip()
    APP_VERSION_FILES = [
        os.path.join(".","resources","app","meta","manifests","app_version.txt"),
        os.path.join("..","build","app_version.txt")
    ]
    for app_version_file in APP_VERSION_FILES:
        if os.path.isfile(app_version_file):
            with open(app_version_file,"r") as f:
                lines = f.readlines()
                if len(lines) > 0:
                    APP_VERSION = lines[0].strip()
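The replacement loop checks each candidate manifest in order and, because it never breaks, the last readable non-empty file wins (so the build-time copy under `../build` overrides the in-repo one). The behavior, isolated as a standalone sketch:

```python
# Standalone sketch of the app-version lookup above: scan candidate files in
# order; a later readable, non-empty file overrides an earlier one.
import os

def read_app_version(candidates):
    version = ""
    for path in candidates:
        if os.path.isfile(path):
            with open(path, "r") as f:
                lines = f.readlines()
            if len(lines) > 0:
                version = lines[0].strip()
    return version
```

Missing files and empty files are simply skipped, so a partially-populated build directory can never blank out a version that was already found.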

    # ci data
    env["CI_SYSTEM"] = os.getenv("CI_SYSTEM","")
    # py data
@@ -96,9 +113,11 @@ def prepare_env():
        OS_VERSION = OS_NAME[OS_NAME.find('-')+1:]
        OS_NAME = OS_NAME[:OS_NAME.find('-')]
        if OS_NAME == "linux" or OS_NAME == "ubuntu":
            if OS_VERSION in UBUNTU_VERSIONS:
                OS_VERSION = UBUNTU_VERSIONS[OS_VERSION]
            OS_DIST = OS_VERSION
            try:
                if distro.codename() != "":
                    OS_DIST = distro.codename()
            except NameError as e:
                pass

        if OS_VERSION == "" and not OS_DIST == "" and not OS_DIST == "notset":
            OS_VERSION = OS_DIST
@@ -111,7 +130,7 @@ def prepare_env():
    # if the app version didn't have the build number, add it
    # set to <app_version>.<build_number>
    if env["BUILD_NUMBER"] not in GITHUB_TAG:
        GITHUB_TAG += '.' + env["BUILD_NUMBER"]
        GITHUB_TAG += ".r" + env["BUILD_NUMBER"]

    env["GITHUB_TAG"] = GITHUB_TAG
    env["OS_NAME"] = OS_NAME
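The tag-suffix change above switches the separator from `.` to `.r` but keeps the same guard: the build number is appended only when it does not already occur somewhere in the tag. As a pure function:

```python
# Sketch of the tag-suffix rule above: append ".r<build>" only when the
# build number is not already a substring of the tag.
def with_build_number(tag: str, build_number: str) -> str:
    if build_number not in tag:
        tag += ".r" + build_number
    return tag
```

Note the guard is a plain substring test, so a tag like `3.1.42` already "contains" build number `42` and is left untouched; that is what keeps repeated runs from stacking suffixes.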

@@ -10,7 +10,7 @@ def get_get_pip(PY_VERSION):
    try:
        import pip
    except ImportError:
        print("Getting pip getter!")
        print("🟡Getting pip getter!")
        # make the request!
        url = "https://bootstrap.pypa.io/get-pip.py"
        context = ssl._create_unverified_context()
@@ -40,7 +40,7 @@ def get_get_pip(PY_VERSION):
    if float(PY_VERSION) > 0:
        PYTHON_EXECUTABLE = "py"

    print("Getting pip!")
    print("🟡Getting pip!")
    args = [
        env["PYTHON_EXE_PATH"] + PYTHON_EXECUTABLE,
        '-' + str(PY_VERSION),
@@ -58,6 +58,6 @@ if __name__ == "__main__":

    try:
        import pip
        print("pip is installed")
        print("🟢pip is installed")
    except ImportError:
        get_get_pip(PY_VERSION)

440 resources/ci/common/get_pipline.py Normal file
@@ -0,0 +1,440 @@
# import modules
import common  # app common functions

import json  # json manipulation
import os  # for os data, filesystem manipulation
import subprocess  # for running shell commands
import sys  # for system commands
import traceback  # for errors

# get env
env = common.prepare_env()  # get environment variables

# width for labels
WIDTH = 70

# bucket for cli args
args = []

# pip exe path
PIPEXE = ""

# py exe path
# py version
# py minor version
PYTHON_EXECUTABLE = os.path.splitext(sys.executable.split(os.path.sep).pop())[0]  # get command to run python
PYTHON_VERSION = sys.version.split(" ")[0]
PYTHON_MINOR_VERSION = '.'.join(PYTHON_VERSION.split(".")[:2])

# pip string version
# pip float version
PIP_VERSION = ""
PIP_FLOAT_VERSION = 0

# success
SUCCESS = False
# bucket for versions
VERSIONS = {}

# process module output
# read output from installing
# print relevant info
# print unknown stuff
def process_module_output(lines):
    for line in lines:
        # if there's an error, print it and bail
        if "status 'error'" in line.strip():
            print(
                "🔴[%s] %s"
                %
                (
                    "_",
                    line.strip()
                )
            )
            return
            # sys.exit(1)
        # if it's already satisfied or building a wheel, print version data
        elif "already satisfied" in line or \
                "Building wheel" in line or \
                "Created wheel" in line:

            modulename = print_module_line(line)

            if "=" not in modulename and VERSIONS[modulename]["installed"] != VERSIONS[modulename]["latest"]:
                # install modules from list
                ret = subprocess.run(
                    [
                        *args,
                        "-m",
                        PIPEXE,
                        "install",
                        "--upgrade",
                        f"{modulename}"
                    ],
                    capture_output=True,
                    text=True
                )
                # if there's output
                if ret.stdout.strip():
                    process_module_output(ret.stdout.strip().split("\n"))

        # ignore lines about certain things
        elif "Attempting uninstall" in line or \
                "Collecting" in line or \
                "Downloading" in line or \
                "eta 0:00:00" in line or \
                "Found existing" in line or \
                "Installing collected" in line or \
                "Preparing metadata" in line or \
                "Successfully built" in line or \
                "Successfully installed" in line or \
                "Successfully uninstalled" in line or \
                "Stored in" in line or \
                "Uninstalling " in line or \
                "Using cached" in line:
            pass
        # else, I don't know what it is, print it
        else:
            print(line.strip())
    print("")

# print module line
# name, installed version, latest version
def print_module_line(line):
    global VERSIONS
    # is it already installed?
    satisfied = line.strip().split(" in ")
    # get the installed version
    sver = ((len(satisfied) > 1) and satisfied[1].split("(").pop().replace(")", "")) or ""

    # if we're making a wheel
    if "Created wheel" in line:
        line = line.strip().split(':')
        satisfied = [line[0]]
        sver = line[1].split('-')[1]

    # get module name
    modulename = satisfied[0].replace("Requirement already satisfied: ", "")
    # save info for later use
    VERSIONS[modulename] = {
        "installed": sver,
        "latest": (sver and get_module_version(satisfied[0].split(" ")[-1])).strip() or ""
    }

    # print what we found
    print(
        (
            "[%s] %s\t%s\t%s"
            %
            (
                "Building wheel" in line and '.' or "X",
                satisfied[0].ljust(len("Requirement already satisfied: ") + len("python-bps-continued")),
                VERSIONS[modulename]["installed"],
                VERSIONS[modulename]["latest"]
            )
        )
    )
    # return the name of this module
    return modulename

# get module version
# get latest available version
def get_module_version(module):
    # pip index versions [module]                              // >= 21.2
    # pip install [module]==                                   // >= 21.1
    # pip install --use-deprecated=legacy-resolver [module]==  // >= 20.3
    # pip install [module]==                                   // >= 9.0
    # pip install [module]==blork                              // < 9.0
    global args
    global PIPEXE
    global PIP_FLOAT_VERSION
    ret = ""
    ver = ""

    # based on version of pip, get the installation status of a module
    if float(PIP_FLOAT_VERSION) >= 21.2:
        ret = subprocess.run(
            [
                *args,
                "-m",
                PIPEXE,
                "index",
                "versions",
                module
            ],
            capture_output=True,
            text=True
        )
        lines = ret.stdout.strip().split("\n")
        lines = lines[2::]
        vers = (list(map(lambda x: x.split(' ')[-1], lines)))
        if len(vers) > 1:
            ver = vers[1]
    elif float(PIP_FLOAT_VERSION) >= 21.1:
        ret = subprocess.run(
            [
                *args,
                "-m",
                PIPEXE,
                "install",
                f"{module}=="
            ],
            capture_output=True,
            text=True
        )
    elif float(PIP_FLOAT_VERSION) >= 20.3:
        ret = subprocess.run(
            [
                *args,
                "-m",
                PIPEXE,
                "install",
                "--use-deprecated=legacy-resolver",
                f"{module}=="
            ],
            capture_output=True,
            text=True
        )
    elif float(PIP_FLOAT_VERSION) >= 9.0:
        ret = subprocess.run(
            [
                *args,
                "-m",
                PIPEXE,
                "install",
                f"{module}=="
            ],
            capture_output=True,
            text=True
        )
    elif float(PIP_FLOAT_VERSION) < 9.0:
        ret = subprocess.run(
            [
                *args,
                "-m",
                PIPEXE,
                "install",
                f"{module}==blork"
            ],
            capture_output=True,
            text=True
        )

    # if ver == "" and ret.stderr.strip():
    #     ver = (ret.stderr.strip().split("\n")[0].split(",")[-1].replace(')', '')).strip()

    # return what we found
    return ver
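The version thresholds that `get_module_version` branches on can be summarized as a pure lookup, which makes the strategy table in its header comment testable on its own (the function name here is ours):

```python
# The pip-version thresholds above as a pure dispatch: given pip's flattened
# float version, return which query style get_module_version would use.
def pip_query_style(pip_float_version: float) -> str:
    if pip_float_version >= 21.2:
        return "pip index versions"
    if pip_float_version >= 21.1:
        return "pip install module=="
    if pip_float_version >= 20.3:
        return "pip install --use-deprecated=legacy-resolver module=="
    if pip_float_version >= 9.0:
        return "pip install module=="
    return "pip install module==blork"
```

The `module==blork` trick for ancient pip works because a deliberately bogus pin makes pip print the list of valid versions in its error output.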

# get python info
def python_info():
    global args
    global PYTHON_VERSION

    # get python debug info
    ret = subprocess.run([*args, "--version"], capture_output=True, text=True)
    if ret.stdout.strip():
        PYTHON_VERSION = ret.stdout.strip().split(" ")[1]
    PY_STRING = (
        "%s\t%s\t%s"
        %
        (
            ((isinstance(args[0], list) and " ".join(
                args[0])) or args[0]).strip(),
            PYTHON_VERSION,
            sys.platform
        )
    )
    print(PY_STRING)
    print('.' * WIDTH)

# get pip info
def pip_info():
    global args
    global PIPEXE
    global VERSIONS

    # get pip debug info
    ret = subprocess.run(
        [
            *args,
            "-m",
            PIPEXE,
            "--version"
        ],
        capture_output=True,
        text=True
    )
    if ret.stdout.strip():
        if " from " in ret.stdout.strip():
            PIP_VERSION = ret.stdout.strip().split(" from ")[0].split(" ")[1]
            if PIP_VERSION:
                b, f, a = PIP_VERSION.partition('.')
                global PIP_FLOAT_VERSION
                PIP_FLOAT_VERSION = b + f + a.replace('.', '')
    PIP_LATEST = get_module_version("pip")

    VERSIONS["py"] = {
        "version": PYTHON_VERSION,
        "platform": sys.platform
    }
    VERSIONS["pip"] = {
        "version": [
            PIP_VERSION,
            PIP_FLOAT_VERSION
        ],
        "latest": PIP_LATEST
    }

    PIP_STRING = (
        "%s\t%s\t%s\t%s\t%s\t%s"
        %
        (
            ((isinstance(args[0], list) and " ".join(
                args[0])) or args[0]).strip(),
            PYTHON_VERSION,
            sys.platform,
            PIPEXE,
            PIP_VERSION,
            PIP_LATEST
        )
    )
    print(PIP_STRING)
    print('.' * WIDTH)
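The `partition` trick in `pip_info` flattens a dotted pip version like `21.2.4` into a single float-comparable string: keep everything up to and including the first dot, then strip every remaining dot. Isolated as a helper (the name is ours):

```python
# How pip_info flattens "21.2.4" into a comparable number: keep the first
# dot, drop the rest, so "21.2.4" -> 21.24 and "9.0.1" -> 9.01.
def pip_float(version: str) -> float:
    major, dot, rest = version.partition('.')
    return float(major + dot + rest.replace('.', ''))
```

This is lossy (it cannot distinguish `21.2.4` from `21.24`), but it only needs to be accurate enough for the coarse `>= 21.2`-style threshold checks in `get_module_version`.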

# upgrade pip
def pip_upgrade():
    global args
    global PIPEXE

    # upgrade pip
    ret = subprocess.run(
        [
            *args,
            "-m",
            PIPEXE,
            "install",
            "--upgrade", "pip"
        ],
        capture_output=True,
        text=True
    )
    # get output
    if ret.stdout.strip():
        # if it's not already satisfied, update it
        if "already satisfied" not in ret.stdout.strip():
            print(ret.stdout.strip())
            pip_info()

# install modules
def install_modules():
    global args
    global PIPEXE
    global SUCCESS

    # install modules from list
    ret = subprocess.run(
        [
            *args,
            "-m",
            PIPEXE,
            "install",
            "-r",
            os.path.join(
                ".",
                "resources",
                "app",
                "meta",
                "manifests",
                "pip_requirements.txt"
            )
        ],
        capture_output=True,
        text=True
    )

    # if there's output
    if ret.stdout.strip():
        process_module_output(ret.stdout.strip().split("\n"))
    manifests_path = os.path.join(".", "resources", "user", "meta", "manifests")
    if not os.path.isdir(manifests_path):
        os.makedirs(manifests_path)

    with open(os.path.join(manifests_path, "settings.json"), "w+") as settings:
        settings.write(
            json.dumps(
                {
                    "py": args,
                    "pip": PIPEXE,
                    "pipline": " ".join(args) + " -m " + PIPEXE,
                    "versions": VERSIONS
                },
                indent=2
            )
        )
    with open(os.path.join(manifests_path, "pipline.txt"), "w+") as settings:
        settings.write(" ".join(args) + " -m " + PIPEXE)
    SUCCESS = True


def main():
    global args
    global PIPEXE
    global SUCCESS
    # print python debug info
    heading = (
        "%s-%s-%s"
        %
        (
            PYTHON_EXECUTABLE,
            PYTHON_VERSION,
            sys.platform
        )
    )
    print(heading)
    print('=' * WIDTH)

    # figure out pip executable
    PIPEXE = "pip" if "windows" in env["OS_NAME"] else "pip3"
    PIPEXE = "pip" if "osx" in env["OS_NAME"] and "actions" in env["CI_SYSTEM"] else PIPEXE

    PIP_VERSION = ""  # holder for pip's version

    SUCCESS = False
    # foreach py executable
    for PYEXE in ["py", "python3", "python"]:
        if SUCCESS:
            continue

        args = []
        # if it's the py launcher, specify the version
        if PYEXE == "py":
            PYEXE = [PYEXE, "-" + PYTHON_MINOR_VERSION]
            # if it ain't windows, skip it
            if "windows" not in env["OS_NAME"]:
                continue

        # build executable command
        if isinstance(PYEXE, list):
            args = [*PYEXE]
        else:
            args = [PYEXE]

        try:
            python_info()

            # foreach pip executable
            for PIPEXE in ["pip3", "pip"]:
                pip_info()
                pip_upgrade()
                install_modules()

        # if something else went fucky, print it
        except Exception as e:
            traceback.print_exc()


if __name__ == "__main__":
    main()
@@ -21,10 +21,11 @@ if not os.path.isdir(os.path.join(".","upx")):
    UPX_FILE = UPX_SLUG + ".tar.xz"
    UPX_URL = "https://github.com/upx/upx/releases/download/v" + UPX_VERSION + '/' + UPX_FILE

    # if it's not macOS
    if "osx" not in env["OS_NAME"]:

        print("Getting UPX: " + UPX_FILE)

        # download UPX
        with open(os.path.join(".",UPX_FILE),"wb") as upx:
            UPX_REQ = urllib.request.Request(
                UPX_URL,
@@ -34,8 +35,10 @@ if not os.path.isdir(os.path.join(".","upx")):
            UPX_DATA = UPX_REQ.read()
            upx.write(UPX_DATA)

        # extract UPX
        unpack_archive(UPX_FILE,os.path.join("."))

        # move UPX
        os.rename(os.path.join(".",UPX_SLUG),os.path.join(".","upx"))
        os.remove(os.path.join(".",UPX_FILE))

resources/ci/common/list_actions.py (new file, 168 lines)
@@ -0,0 +1,168 @@
# pylint: disable=invalid-name
'''
List GitHub Actions versions used and latest versions
'''
import json
import os
import ssl
import urllib.error
import urllib.request
import yaml
from json.decoder import JSONDecodeError

allACTIONS = {}
listACTIONS = []

VER_WIDTH = 10
NAME_WIDTH = 40
LINE_WIDTH = 1 + NAME_WIDTH + 5 + VER_WIDTH + 5 + VER_WIDTH + 1


def process_walk(key, node):
    '''
    Process one leaf while walking through the tree
    '''
    global allACTIONS
    global listACTIONS
    if key == "uses":
        action = node.split('@')
        version = ""
        if '@' in node:
            version = action[1]
        action = action[0]
        if action not in allACTIONS:
            allACTIONS[action] = {
                "versions": [],
                "latest": ""
            }
        allACTIONS[action]["versions"].append(version)
        allACTIONS[action]["versions"] = list(
            set(
                allACTIONS[action]["versions"]
            )
        )
        listACTIONS.append(node)


def walk(key, node):
    '''
    How to walk through the tree
    '''
    if isinstance(node, dict):
        return {k: walk(k, v) for k, v in node.items()}
    elif isinstance(node, list):
        return [walk(key, x) for x in node]
    else:
        return process_walk(key, node)
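A minimal, self-contained sketch of the traversal these two functions implement, assuming a workflow already parsed by `yaml.safe_load` (the dict literal here stands in for a real file):

```python
# Hypothetical parsed workflow; real input comes from yaml.safe_load().
workflow = {
    "jobs": {
        "build": {
            "steps": [
                {"uses": "actions/checkout@v4"},
                {"uses": "actions/setup-python@v5", "with": {"python-version": "3.12"}},
                {"run": "python -m pip install -r requirements.txt"},
            ]
        }
    }
}

found = []

def walk(key, node):
    # recurse dicts and lists; record any leaf reached via a "uses" key
    if isinstance(node, dict):
        for k, v in node.items():
            walk(k, v)
    elif isinstance(node, list):
        for item in node:
            walk(key, item)
    elif key == "uses":
        found.append(node)

walk("uses", workflow)
print(found)  # both action references, in document order
```

Only the leaves whose nearest dict key was `uses` survive; `run:` steps and nested `with:` values fall through untouched.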


for r, d, f in os.walk(os.path.join(".", ".github")):
    if "actions" in r or "workflows" in r:
        for filename in f:
            # if it's not a YAML file or it's turned off, skip it
            if (".yml" not in filename and ".yaml" not in filename) or (".off" in filename):
                continue
            listACTIONS = []
            # print filename
            filename_line = "-" * (len(os.path.join(r, filename)) + 2)
            print(
                " " +
                filename_line +
                " "
            )
            print("| " + os.path.join(r, filename) + " |")
            # read the file
            with(open(os.path.join(r, filename), "r", encoding="utf-8")) as yamlFile:
                print(
                    "|" +
                    filename_line +
                    "-" +
                    ("-" * (LINE_WIDTH - len(filename_line) + 1)) +
                    " "
                )
                yml = yaml.safe_load(yamlFile)
                walk("uses", yml)
                dictACTIONS = {}
                for k in sorted(list(set(listACTIONS))):
                    action = k.split('@')[0]
                    version = k.split('@')[1] if '@' in k else ""
                    latest = ""
                    # if it's not a local action, get the latest version number
                    if "./." not in action:
                        apiURL = f"https://api.github.com/repos/{action}/releases/latest"
                        apiReq = None
                        try:
                            apiReq = urllib.request.urlopen(
                                apiURL,
                                context=ssl._create_unverified_context()
                            )
                        except urllib.error.HTTPError as e:
                            if e.code != 403:
                                print(e.code, apiURL)
                        if apiReq:
                            apiRes = {}
                            try:
                                apiRes = json.loads(
                                    apiReq.read().decode("utf-8"))
                            except JSONDecodeError as e:
                                raise ValueError("🔴API Request failed: " + apiURL)
                            if apiRes:
                                latest = apiRes["tag_name"] if "tag_name" in apiRes else ""
                                if latest != "":
                                    allACTIONS[action]["latest"] = latest
                    dictACTIONS[action] = version
                # print action name and version info
                for action, version in dictACTIONS.items():
                    print(
                        "| " +
                        f"{action.ljust(NAME_WIDTH)}" +
                        "\t" +
                        f"{(version or 'N/A').ljust(VER_WIDTH)}" +
                        "\t" +
                        f"{(allACTIONS[action]['latest'] or 'N/A').ljust(VER_WIDTH)}" +
                        " |"
                    )
                print(
                    " " +
                    ("-" * (LINE_WIDTH + 2)) +
                    " "
                )
                print("")

# print outdated versions summary
first = True
outdated = False
for action, actionData in allACTIONS.items():
    if len(actionData["versions"]) > 0:
        if actionData["latest"] != "" and actionData["versions"][0] != actionData["latest"]:
            outdated = True
            if first:
                first = False
                filename_line = "-" * (len("| Outdated |"))
                print(
                    " " +
                    filename_line +
                    " "
                )
                print("| 🔴Outdated |")
                print(
                    "|" +
                    filename_line +
                    "-" +
                    ("-" * (LINE_WIDTH - len(filename_line) + 1)) +
                    " "
                )
            print(
                "| " +
                f"{action.ljust(NAME_WIDTH)}" +
                "\t" +
                f"{(','.join(actionData['versions']) or 'N/A').ljust(VER_WIDTH)}" +
                "\t" +
                f"{actionData['latest'].ljust(VER_WIDTH)}" +
                " |"
            )
if outdated:
    print(
        " " +
        ("-" * (LINE_WIDTH + 2)) +
        " "
    )

@@ -5,12 +5,12 @@ from shutil import copy  # file manipulation
env = common.prepare_env()

# set tag to app_version.txt
if not env["GITHUB_TAG"] == "":
    with open(os.path.join(".","resources","app","meta","manifests","app_version.txt"),"w+") as f:
        _ = f.read()
        f.seek(0)
        f.write(env["GITHUB_TAG"])
        f.truncate()
# if not env["GITHUB_TAG"] == "":
#     with open(os.path.join(".","resources","app","meta","manifests","app_version.txt"),"w+") as f:
#         _ = f.read()
#         f.seek(0)
#         f.write(env["GITHUB_TAG"])
#         f.truncate()

if not os.path.isdir(os.path.join("..","build")):
    os.mkdir(os.path.join("..","build"))

@@ -1,42 +1,48 @@
-import distutils.dir_util  # for copying trees
+"""
+Locate and prepare binary builds
+"""
+# import distutils.dir_util  # for copying trees
 import os  # for env vars
-import stat  # for file stats
-import subprocess  # do stuff at the shell level
+# import stat  # for file stats
+# import subprocess  # do stuff at the shell level
 import common
-from shutil import copy, make_archive, move, rmtree  # file manipulation
+from shutil import move  # file manipulation

 env = common.prepare_env()

 # make dir to put the binary in
 if not os.path.isdir(os.path.join("..","artifact")):
     os.mkdir(os.path.join("..","artifact"))

 BUILD_FILENAME = ""

 # list executables
 BUILD_FILENAME = common.find_binary('.')
 if BUILD_FILENAME == "":
     BUILD_FILENAME = common.find_binary(os.path.join("..","artifact"))

 if isinstance(BUILD_FILENAME,str):
     BUILD_FILENAME = list(BUILD_FILENAME)

 BUILD_FILENAMES = BUILD_FILENAME

+print("OS Name: " + env["OS_NAME"])
+print("OS Version: " + env["OS_VERSION"])
+print("OS Distribution: " + env["OS_DIST"])
+print("")
 for BUILD_FILENAME in BUILD_FILENAMES:
     DEST_FILENAME = common.prepare_filename(BUILD_FILENAME)

-    print("OS Name: " + env["OS_NAME"])
-    print("OS Version: " + env["OS_VERSION"])
     print("Build Filename: " + BUILD_FILENAME)
     print("Dest Filename: " + DEST_FILENAME)
     if not BUILD_FILENAME == "":
         print("Build Filesize: " + common.file_size(BUILD_FILENAME))
     else:
         exit(1)

     if not BUILD_FILENAME == "":
         move(
             os.path.join(".",BUILD_FILENAME),
             os.path.join("..","artifact",BUILD_FILENAME)
         )
 print("")

@@ -101,7 +101,8 @@ if len(BUILD_FILENAMES) > 0:
     # .zip if windows
     # .tar.gz otherwise
     if len(BUILD_FILENAMES) > 1:
-        ZIP_FILENAME = os.path.join("..","deploy",env["REPO_NAME"])
+        # ZIP_FILENAME = os.path.join("..","deploy",env["REPO_NAME"])
+        ZIP_FILENAME = os.path.join("..","deploy","ALttPOverworldRandomizer")
     else:
         ZIP_FILENAME = os.path.join("..","deploy",os.path.splitext(BUILD_FILENAME)[0])
     if env["OS_NAME"] == "windows":
@@ -124,15 +125,15 @@ for BUILD_FILENAME in BUILD_FILENAMES:
         print("Build Filename: " + BUILD_FILENAME)
         print("Build Filesize: " + common.file_size(BUILD_FILENAME))
     else:
-        print("No Build to prepare: " + BUILD_FILENAME)
+        print("🟡No Build to prepare: " + BUILD_FILENAME)

 if not ZIP_FILENAME == "":
     print("Zip Filename: " + ZIP_FILENAME)
     print("Zip Filesize: " + common.file_size(ZIP_FILENAME))
 else:
-    print("No Zip to prepare: " + ZIP_FILENAME)
+    print("🟡No Zip to prepare: " + ZIP_FILENAME)

 print("Git tag: " + env["GITHUB_TAG"])
 print("App Version: " + env["GITHUB_TAG"])

 if (len(BUILD_FILENAMES) == 0) or (ZIP_FILENAME == ""):
     exit(1)

@@ -1,68 +0,0 @@
# -*- mode: python -*-

import sys

block_cipher = None
console = True  # <--- change this to True to enable command prompt when the app runs

if sys.platform.find("mac") or sys.platform.find("osx"):
    console = False

BINARY_SLUG = "DungeonRandomizer"

def recurse_for_py_files(names_so_far):
    returnvalue = []
    for name in os.listdir(os.path.join(*names_so_far)):
        if name != "__pycache__":
            subdir_name = os.path.join(*names_so_far, name)
            if os.path.isdir(subdir_name):
                new_name_list = names_so_far + [name]
                for filename in os.listdir(os.path.join(*new_name_list)):
                    base_file,file_extension = os.path.splitext(filename)
                    if file_extension == ".py":
                        new_name = ".".join(new_name_list+[base_file])
                        if not new_name in returnvalue:
                            returnvalue.append(new_name)
                returnvalue.extend(recurse_for_py_files(new_name_list))
    returnvalue.append("PIL._tkinter_finder")  # Linux needs this
    return returnvalue

hiddenimports = []
binaries = []

a = Analysis([f"../{BINARY_SLUG}.py"],
             pathex=[],
             binaries=binaries,
             datas=[('../data/', 'data/')],
             hiddenimports=hiddenimports,
             hookspath=[],
             runtime_hooks=[],
             excludes=[],
             win_no_prefer_redirects=False,
             win_private_assemblies=False,
             cipher=block_cipher,
             noarchive=False)

# https://stackoverflow.com/questions/17034434/how-to-remove-exclude-modules-and-files-from-pyinstaller
excluded_binaries = [
    'VCRUNTIME140.dll',
    'ucrtbase.dll',
    'msvcp140.dll',
    'mfc140u.dll']
a.binaries = TOC([x for x in a.binaries if x[0] not in excluded_binaries])

pyz = PYZ(a.pure, a.zipped_data,
          cipher=block_cipher)
exe = EXE(pyz,
          a.scripts,
          a.binaries,
          a.zipfiles,
          a.datas,
          [],
          name=BINARY_SLUG,
          debug=False,
          bootloader_ignore_signals=False,
          strip=False,
          upx=True,
          runtime_tmpdir=None,
          console=console )
@@ -1,69 +0,0 @@
# -*- mode: python -*-

import sys

block_cipher = None
console = True  # <--- change this to True to enable command prompt when the app runs

if sys.platform.find("mac") or sys.platform.find("osx"):
    console = False

BINARY_SLUG = "Gui"

def recurse_for_py_files(names_so_far):
    returnvalue = []
    for name in os.listdir(os.path.join(*names_so_far)):
        if name != "__pycache__":
            subdir_name = os.path.join(*names_so_far, name)
            if os.path.isdir(subdir_name):
                new_name_list = names_so_far + [name]
                for filename in os.listdir(os.path.join(*new_name_list)):
                    base_file,file_extension = os.path.splitext(filename)
                    if file_extension == ".py":
                        new_name = ".".join(new_name_list+[base_file])
                        if not new_name in returnvalue:
                            returnvalue.append(new_name)
                returnvalue.extend(recurse_for_py_files(new_name_list))
    returnvalue.append("PIL._tkinter_finder")  # Linux needs this
    return returnvalue

hiddenimports = []
binaries = []

a = Analysis([f"../{BINARY_SLUG}.py"],
             pathex=[],
             binaries=binaries,
             datas=[('../data/', 'data/')],
             hiddenimports=hiddenimports,
             hookspath=[],
             runtime_hooks=[],
             excludes=[],
             win_no_prefer_redirects=False,
             win_private_assemblies=False,
             cipher=block_cipher,
             noarchive=False)

# https://stackoverflow.com/questions/17034434/how-to-remove-exclude-modules-and-files-from-pyinstaller
excluded_binaries = [
    'VCRUNTIME140.dll',
    'ucrtbase.dll',
    'msvcp140.dll',
    'mfc140u.dll']
a.binaries = TOC([x for x in a.binaries if x[0] not in excluded_binaries])

pyz = PYZ(a.pure, a.zipped_data,
          cipher=block_cipher)
exe = EXE(pyz,
          a.scripts,
          a.binaries,
          a.zipfiles,
          a.datas,
          [],
          name=BINARY_SLUG,
          debug=False,
          bootloader_ignore_signals=False,
          icon='../data/ER.ico',
          strip=False,
          upx=True,
          runtime_tmpdir=None,
          console=console )
source/Template.spec (new file, 98 lines)
@@ -0,0 +1,98 @@
# -*- mode: python -*-

import json
import os
import sys
from json.decoder import JSONDecodeError
from PyInstaller.utils.hooks import collect_submodules

block_cipher = None
console = False  # <--- change this to True to enable command prompt when the app runs

# sys.platform reports "darwin" on macOS
if sys.platform == "darwin":
    console = True

BINARY_SLUG = "<BINARY_SLUG>"


def recurse_for_py_files(names_so_far):
    # get py files
    returnvalue = []
    for name in os.listdir(os.path.join(*names_so_far)):
        # ignore __pycache__
        if name != "__pycache__":
            subdir_name = os.path.join(*names_so_far, name)
            if os.path.isdir(subdir_name):
                new_name_list = names_so_far + [name]
                for filename in os.listdir(os.path.join(*new_name_list)):
                    base_file, file_extension = os.path.splitext(filename)
                    # if it's a .py
                    if file_extension == ".py":
                        new_name = ".".join(new_name_list + [base_file])
                        if new_name not in returnvalue:
                            returnvalue.append(new_name)
                returnvalue.extend(recurse_for_py_files(new_name_list))
    return returnvalue


hiddenimports = recurse_for_py_files(["source"])
for hidden in collect_submodules("pkg_resources"):
    hiddenimports.append(hidden)

a = Analysis(
    [f"../{BINARY_SLUG}.py"],
    pathex=[],
    binaries=[],
    datas=[('../data/', 'data/')],
    hiddenimports=hiddenimports,
    hookspath=[],
    runtime_hooks=[],
    excludes=[],
    win_no_prefer_redirects=False,
    win_private_assemblies=False,
    cipher=block_cipher,
    noarchive=False
)

# https://stackoverflow.com/questions/17034434/how-to-remove-exclude-modules-and-files-from-pyinstaller
excluded_binaries = [
    'mfc140u.dll',
    'msvcp140.dll',
    'ucrtbase.dll',
    'VCRUNTIME140.dll'
]

# Windows is temperamental about which api-ms-win DLLs can ship
with open(os.path.join(".","resources","app","meta","manifests","excluded_dlls.json")) as dllsManifest:
    dlls = []
    try:
        dlls = json.load(dllsManifest)
    except JSONDecodeError as e:
        raise ValueError("Windows DLLs manifest malformed!")
    for dll in dlls:
        for submod in ["core", "crt"]:
            for ver in ["1-1-0", "1-1-1", "1-2-0", "2-1-0"]:
                excluded_binaries.append(f"api-ms-win-{submod}-{dll}-l{ver}.dll")

a.binaries = TOC([x for x in a.binaries if x[0] not in excluded_binaries])

pyz = PYZ(
    a.pure,
    a.zipped_data,
    cipher=block_cipher
)
exe = EXE(
    pyz,
    a.scripts,
    a.binaries,
    a.zipfiles,
    a.datas,
    [],
    name=BINARY_SLUG,
    debug=False,
    bootloader_ignore_signals=False,
    strip=False,
    upx=True,
    runtime_tmpdir=None,
    console=console
)
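The manifest-driven exclusion loop in the spec template expands each short manifest entry into every core/crt versioned variant it might ship as; a minimal sketch with a hypothetical `excluded_dlls.json` entry:

```python
# "file" is a hypothetical manifest entry, standing in for excluded_dlls.json.
dlls = ["file"]

excluded = []
for dll in dlls:
    for submod in ["core", "crt"]:
        for ver in ["1-1-0", "1-1-1", "1-2-0", "2-1-0"]:
            # same name scheme as the template's exclusion loop
            excluded.append(f"api-ms-win-{submod}-{dll}-l{ver}.dll")

print(len(excluded))   # 8 candidate names per manifest entry
print(excluded[0])     # api-ms-win-core-file-l1-1-0.dll
```

Each entry therefore only needs to record the middle token of the DLL name; the two submodules and four version suffixes are recombined at spec time.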
source/classes/appversion.py (new file, 17 lines)
@@ -0,0 +1,17 @@
import os

from OverworldShuffle import __version__
OWR_VERSION = __version__

def write_appversion():
    APP_VERSION = OWR_VERSION
    if "-" in APP_VERSION:
        APP_VERSION = APP_VERSION[:APP_VERSION.find("-")]
    APP_VERSION_FILE = os.path.join(".","resources","app","meta","manifests","app_version.txt")
    with open(APP_VERSION_FILE,"w") as f:
        f.seek(0)
        f.truncate()
        f.write(APP_VERSION)

if __name__ == "__main__":
    write_appversion()
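`write_appversion()` keeps only the part of the version string before the first hyphen, so a pre-release suffix never reaches `app_version.txt`; for example (the version strings here are hypothetical):

```python
def strip_prerelease(version):
    # mirror of the truncation above: drop everything from the first "-" on
    if "-" in version:
        version = version[:version.find("-")]
    return version

print(strip_prerelease("0.3.1-dev"))  # 0.3.1
print(strip_prerelease("0.3.1"))      # 0.3.1
```

This matters because the CI tagger builds Git tags from the file's contents, and a dangling `-dev` suffix would otherwise leak into release tags.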
@@ -1,16 +1,28 @@
 import platform, sys, os, subprocess
-import pkg_resources
-from datetime import datetime
+try:
+    import pkg_resources
+except ModuleNotFoundError as e:
+    pass
+import datetime

+from Main import __version__
+DR_VERSION = __version__

+from OverworldShuffle import __version__
+OWR_VERSION = __version__

+PROJECT_NAME = "ALttP Overworld Randomizer"

 def diagpad(str):
-    return str.ljust(len("ALttP Door Randomizer Version") + 5,'.')
+    return str.ljust(len(f"{PROJECT_NAME} Version") + 5,'.')

-def output(APP_VERSION):
+def output():
     lines = [
-        "ALttP Door Randomizer Diagnostics",
+        f"{PROJECT_NAME} Diagnostics",
         "=================================",
-        diagpad("UTC Time") + str(datetime.utcnow())[:19],
-        diagpad("ALttP Door Randomizer Version") + APP_VERSION,
+        diagpad("UTC Time") + str(datetime.datetime.now(datetime.UTC))[:19],
+        diagpad("ALttP Door Randomizer Version") + DR_VERSION,
+        diagpad(f"{PROJECT_NAME} Version") + OWR_VERSION,
         diagpad("Python Version") + platform.python_version()
     ]
     lines.append(diagpad("OS Version") + "%s %s" % (platform.system(), platform.release()))
@@ -35,6 +47,7 @@ def output(APP_VERSION):
         pkg = pkg.split("==")
         lines.append(diagpad(pkg[0]) + pkg[1])
     '''
+    installed_packages = []
     installed_packages = [str(d) for d in pkg_resources.working_set]  # this doesn't work from the .exe either, but it doesn't crash the program
     installed_packages.sort()
     for pkg in installed_packages:

@@ -154,7 +154,7 @@ def generation_page(parent,settings):
     diag.geometry(str(dims["window"]["width"]) + 'x' + str(dims["window"]["height"]))
     text = Text(diag, width=dims["textarea.characters"]["width"], height=dims["textarea.characters"]["height"])
     text.pack()
-    text.insert(INSERT,"\n".join(diagnostics.output(__version__)))
+    text.insert(INSERT,"\n".join(diagnostics.output()))
     # dialog button
     self.widgets[widget].pieces["button"] = Button(self.widgets[widget].pieces["frame"], text='Run Diagnostics', command=partial(diags))

@@ -1,27 +0,0 @@
import subprocess
import os
import shutil
import sys

# Spec file
SPEC_FILE = os.path.join(".", "source", "DungeonRandomizer.spec")

# Destination is current dir
DEST_DIRECTORY = '.'

# Check for UPX
if os.path.isdir("upx"):
    upx_string = "--upx-dir=upx"
else:
    upx_string = ""

if os.path.isdir("build") and not sys.platform.find("mac") and not sys.platform.find("osx"):
    shutil.rmtree("build")

# Run pyinstaller for DungeonRandomizer
subprocess.run(" ".join([f"pyinstaller {SPEC_FILE} ",
                         upx_string,
                         "-y ",
                         f"--distpath {DEST_DIRECTORY} ",
                         ]),
               shell=True)
@@ -1,27 +0,0 @@
import subprocess
import os
import shutil
import sys

# Spec file
SPEC_FILE = os.path.join(".", "source", "Gui.spec")

# Destination is current dir
DEST_DIRECTORY = '.'

# Check for UPX
if os.path.isdir("upx"):
    upx_string = "--upx-dir=upx"
else:
    upx_string = ""

if os.path.isdir("build") and not sys.platform.find("mac") and not sys.platform.find("osx"):
    shutil.rmtree("build")

# Run pyinstaller for Gui
subprocess.run(" ".join([f"pyinstaller {SPEC_FILE} ",
                         upx_string,
                         "-y ",
                         f"--distpath {DEST_DIRECTORY} ",
                         ]),
               shell=True)
source/meta/build.py (new file, 155 lines)
@@ -0,0 +1,155 @@
'''
Build Entrypoints
'''
import json
import platform
import os  # for checking for dirs
import re
from json.decoder import JSONDecodeError
from subprocess import Popen, PIPE, STDOUT, CalledProcessError

DEST_DIRECTORY = "."

# UPX greatly reduces the filesize. You can get this utility from https://upx.github.io/
# just place it in a subdirectory named "upx" and this script will find it
UPX_DIR = "upx"
if os.path.isdir(os.path.join(".", UPX_DIR)):
    upx_string = f"--upx-dir={UPX_DIR}"
else:
    upx_string = ""
GO = True
DIFF_DLLS = False


# set a global var for Actions to try to read
def set_output(name, value):
    with open(os.environ['GITHUB_OUTPUT'], 'a') as fh:
        print(f'{name}={value}', file=fh)


# build the thing
def run_build(slug):
    global GO
    global DIFF_DLLS

    print(f"Building '{slug}' via Python {platform.python_version()}")

    # get the template, modify it for this binary
    specTemplateFile = open(os.path.join(".", "source", "Template.spec"))
    specTemplate = specTemplateFile.read()
    specTemplateFile.close()
    with(open(os.path.join(".", "source", f"{slug}.spec"), "w")) as specFile:
        print(f"Writing '{slug}' PyInstaller spec file")
        thisTemplate = specTemplate.replace("<BINARY_SLUG>", slug)
        specFile.write(thisTemplate)

    PYINST_EXECUTABLE = "pyinstaller"
    args = [
        os.path.join("source", f"{slug}.spec").replace(os.sep, os.sep * 2),
        upx_string,
        "-y",
        f"--distpath={DEST_DIRECTORY}"
    ]
    errs = []
    strs = []
    print("PyInstaller args: %s" % " ".join(args))
    cmd = [
        PYINST_EXECUTABLE,
        *args
    ]

    ret = {
        "stdout": [],
        "stderr": []
    }

    with Popen(cmd, stdout=PIPE, stderr=STDOUT, bufsize=1, universal_newlines=True) as p:
        for line in p.stdout:
            ret["stdout"].append(line)
            print(line, end='')
        # if p.stderr:
        #     for line in p.stderr:
        #         ret["stderr"].append(line)
        #         print(line, end='')
    # if p.returncode != 0:
    #     raise CalledProcessError(p.returncode, p.args)

    # check stdout & stderr
    for key in ["stdout", "stderr"]:
        if len(ret[key]) > 0:
            for line in ret[key]:
                # print UPX messages
                if "UPX" in line:
                    print(line)
                # UPX can't compress this file; try to get the DLL filename
                elif "NotCompressibleException" in line.strip():
                    print(line)
                    matches = re.search(r'api-ms-win-(?:[^-]*)-([^-]*)', line.strip())
                    if matches:
                        strAdd = matches.group(1)
                        strs.append(strAdd)
                    errs.append(line.strip())
    # print collected errors
    if len(errs) > 0:
        print("=" * 10)
        print("| ERRORS |")
        print("=" * 10)
        print("\n".join(errs))
    else:
        GO = False

    # if we identified DLLs to ignore
    if len(strs) > 0:
        # read the DLLs manifest we've already got saved;
        # "r+" so the existing list can be read before being rewritten ("w+" would truncate it)
        with open(os.path.join(".", "resources", "app", "meta", "manifests", "excluded_dlls.json"), "r+", encoding="utf-8") as dllsManifest:
            oldDLLs = []
            try:
                oldDLLs = json.load(dllsManifest)
            except JSONDecodeError as e:
                oldDLLs = []
                # raise ValueError("Windows DLLs manifest malformed!")

            # bucket for the new list
            newDLLs = sorted(list(set(oldDLLs)))

            # items to add
            addDLLs = sorted(list(set(strs)))

            # add items
            newDLLs += addDLLs
            newDLLs = sorted(list(set(newDLLs)))

            # if the lists differ, we have to update the saved list
            diffDLLs = newDLLs != oldDLLs

            if diffDLLs:
                DIFF_DLLS = True
                dllsManifest.seek(0)
                dllsManifest.truncate()
                dllsManifest.write(json.dumps(sorted(newDLLs), indent=2))

            print(f"Old DLLs: {json.dumps(sorted(oldDLLs))}")
            print(f"Add DLLs: {json.dumps(sorted(addDLLs))}")
            print(f"New DLLs: {json.dumps(sorted(newDLLs))}")
            print(f"Diff DLLs: {DIFF_DLLS}")
            print("")


def go_build(slug):
    global GO
    slug = slug or ""
    if slug != "":
        GO = True
        while GO:
            run_build(slug)
            GO = False


if __name__ == "__main__":
    binary_slugs = []
    # TODO: Make sure we've got the proper binaries that we need
    with open(os.path.join(".", "resources", "app", "meta", "manifests", "binaries.json")) as binariesFile:
        binary_slugs = json.load(binariesFile)
    for file_slug in binary_slugs:
        go_build(file_slug)
    if DIFF_DLLS:
        print("🔴Had to update Error DLLs list!")
        exit(1)
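`run_build()` merges newly captured DLL names into the saved manifest as a sorted, de-duplicated list and only flags a diff when that actually changes anything; the merge itself reduces to a set union (values here are hypothetical):

```python
# Hypothetical old manifest contents and names captured from the build log.
oldDLLs = ["file", "heap"]
strs = ["heap", "string", "string"]  # duplicates from repeated log lines

# sorted set union, equivalent to the append-then-dedupe sequence above
newDLLs = sorted(set(oldDLLs) | set(strs))
diff = newDLLs != sorted(set(oldDLLs))

print(newDLLs)  # ['file', 'heap', 'string']
print(diff)     # True
```

Sorting before comparing is what makes the diff check stable: two lists with the same members in different orders never trigger a spurious manifest update.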
|
||||
source/meta/check_errordlls.py (new file, 10 lines)
@@ -0,0 +1,10 @@
import json
import os

error_dlls_path = os.path.join(".", "resources", "app", "meta", "manifests", "excluded_dlls.json")
if os.path.isfile(error_dlls_path):
    with open(error_dlls_path, "r") as error_dlls_file:
        error_dlls_json = json.load(error_dlls_file)
        if len(error_dlls_json) > 0 and error_dlls_json[0].strip() != "":
            print(error_dlls_json)
            # exit(1)
source/meta/run_diags.py (new file, 10 lines)
@@ -0,0 +1,10 @@
from source.classes import diags as diags

VERBOSE = True

if __name__ == "__main__":
    if VERBOSE:
        print("DIAGNOSTICS")
        print('.' * 70)
    print("\n".join(diags.output()))
@@ -1,3 +1,4 @@
|
||||
import os
|
||||
import subprocess
|
||||
import sys
|
||||
import multiprocessing
|
||||
@@ -8,6 +9,16 @@ from collections import OrderedDict
|
||||
cpu_threads = multiprocessing.cpu_count()
|
||||
py_version = f"{sys.version_info.major}.{sys.version_info.minor}"
|
||||
|
||||
PYLINE = "python"
|
||||
PIPLINE_PATH = os.path.join(".","resources","user","meta","manifests","pipline.txt")
|
||||
if os.path.isfile(PIPLINE_PATH):
|
||||
with open(PIPLINE_PATH) as pipline_file:
|
||||
PYLINE = pipline_file.read().replace("-m pip","").strip()
|
||||
|
||||
results = {
|
||||
"errors": [],
|
||||
"success": []
|
||||
}
|
||||
|
||||
def main(args=None):
|
||||
successes = []
|
||||
@@ -25,7 +36,7 @@ def main(args=None):
|
||||
|
||||
def test(testname: str, command: str):
|
||||
tests[testname] = [command]
|
||||
basecommand = f"python3.8 Mystery.py --suppress_rom --suppress_meta"
|
||||
basecommand = f"{PYLINE} Mystery.py --suppress_rom --suppress_meta"
|
||||
|
||||
def gen_seed():
|
||||
taskcommand = basecommand + " " + command
|
||||
@@ -98,6 +109,10 @@ if __name__ == "__main__":
|
||||
|
||||
cpu_threads = args.cpu_threads
|
||||
|
||||
LOGPATH = os.path.join(".","logs")
|
||||
if not os.path.isdir(LOGPATH):
|
||||
os.makedirs(LOGPATH)
|
||||
|
||||
for dr in [['mystery', args.count if args.count else 1, 1]]:
|
||||
|
||||
for tense in range(1, dr[2] + 1):
|
||||
@@ -112,13 +127,36 @@ if __name__ == "__main__":
            print()
 
            if errors:
-                with open(f"{dr[0]}{(f'-{tense}' if dr[0] in ['basic', 'crossed'] else '')}-errors.txt", 'w') as stream:
+                errors_filename = f"{dr[0]}"
+                if dr[0] in ["basic","crossed"]:
+                    errors_filename += f"-{tense}"
+                errors_filename += "-errors.txt"
+                with open(
+                    os.path.join(
+                        LOGPATH,
+                        errors_filename
+                    ),
+                    'w'
+                ) as stream:
                    for error in errors:
                        stream.write(error[0] + "\n")
                        stream.write(error[1] + "\n")
                        stream.write(error[2] + "\n\n")
+                        error[2] = error[2].split("\n")
+                        results["errors"].append(error)
 
-    with open("success.txt", "w") as stream:
+    with open(os.path.join(LOGPATH, "mystery-success.txt"), "w") as stream:
        stream.write(str.join("\n", successes))
+    results["success"] = successes
 
-    input("Press enter to continue")
+    num_errors = len(results["errors"])
+    num_success = len(results["success"])
+    num_total = num_errors + num_success
+
+    print(f"Errors: {num_errors}/{num_total}")
+    print(f"Success: {num_success}/{num_total}")
+    # print(results)
+
+    if (num_errors/num_total) > (num_success/num_total):
+        # exit(1)
+        pass

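The replacement of the one-line f-string with an explicit filename builder can be sketched in isolation; `errors_filename` here is a standalone function version of the same logic, with `mode` standing in for `dr[0]`:

```python
def errors_filename(mode, tense):
    # Rebuilds the error-log name the same way the diff does: the tense
    # suffix is attached only for the "basic" and "crossed" modes.
    name = f"{mode}"
    if mode in ["basic", "crossed"]:
        name += f"-{tense}"
    name += "-errors.txt"
    return name

print(errors_filename("mystery", 1))  # mystery-errors.txt
print(errors_filename("basic", 2))    # basic-2-errors.txt
```

Splitting the construction into explicit steps also makes it easy to join the name onto `LOGPATH`, which the original nested-f-string form did not do.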
@@ -10,6 +10,16 @@ from collections import OrderedDict
 cpu_threads = multiprocessing.cpu_count()
 py_version = f"{sys.version_info.major}.{sys.version_info.minor}"
 
+PYLINE = "python"
+PIPLINE_PATH = os.path.join(".","resources","user","meta","manifests","pipline.txt")
+if os.path.isfile(PIPLINE_PATH):
+    with open(PIPLINE_PATH) as pipline_file:
+        PYLINE = pipline_file.read().replace("-m pip","").strip()
+
+results = {
+    "errors": [],
+    "success": []
+}
 
 def main(args=None):
     successes = []
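The `PYLINE` derivation above assumes `pipline.txt` stores a full pip invocation; stripping `"-m pip"` leaves just the interpreter command. A minimal sketch of that transformation (the example manifest contents and the helper name are assumptions, not taken from the repo):

```python
def pyline_from_pipline(pip_invocation):
    # pipline.txt is assumed to hold a pip command line such as
    # "python3.9 -m pip"; dropping "-m pip" leaves the bare
    # interpreter command used to launch the randomizer.
    return pip_invocation.replace("-m pip", "").strip()

print(pyline_from_pipline("python3.9 -m pip"))  # python3.9
print(pyline_from_pipline("py -3.8 -m pip"))    # py -3.8
```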
@@ -28,7 +38,7 @@ def main(args=None):
 
     def test(test_name: str, command: str, test_file: str):
         tests[test_name] = [command]
-    base_command = f"python3 DungeonRandomizer.py --suppress_rom --suppress_spoiler"
+    base_command = f"{PYLINE} DungeonRandomizer.py --suppress_rom --jsonout --spoiler none"
 
     def gen_seed():
         task_command = base_command + " " + command
@@ -102,7 +112,7 @@ if __name__ == "__main__":
 
     test_suites = {}
     # not sure if it supports subdirectories properly yet
-    for root, dirnames, filenames in os.walk('test/suite'):
+    for root, dirnames, filenames in os.walk(os.path.join("test","suite")):
         test_suites[root] = fnmatch.filter(filenames, '*.yaml')
 
     args = argparse.Namespace()
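The suite-discovery loop above can be exercised standalone; `collect_suites` is a hypothetical wrapper around the same `os.walk` + `fnmatch` pattern, shown here against a throwaway directory rather than the repo's `test/suite` tree:

```python
import fnmatch
import os
import tempfile

def collect_suites(base):
    # Same shape as the diff's discovery loop: map each directory under
    # `base` to the list of *.yaml suite files it directly contains.
    suites = {}
    for root, dirnames, filenames in os.walk(base):
        suites[root] = fnmatch.filter(filenames, "*.yaml")
    return suites

with tempfile.TemporaryDirectory() as tmp:
    open(os.path.join(tmp, "owg.yaml"), "w").close()
    open(os.path.join(tmp, "notes.txt"), "w").close()
    print(collect_suites(tmp)[tmp])  # ['owg.yaml']
```

Since `os.walk` recurses, subdirectories do get their own dictionary entries, which is likely why the "not sure if it supports subdirectories properly yet" comment is about how those entries are consumed, not about discovery itself.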
@@ -113,14 +123,30 @@ if __name__ == "__main__":
     successes += s
     print()
 
+    LOGPATH = os.path.join(".","logs")
+    if not os.path.isdir(LOGPATH):
+        os.makedirs(LOGPATH)
+
     if errors:
-        with open(f"new-test-suite-errors.txt", 'w') as stream:
+        with open(os.path.join(LOGPATH, "new-test-suite-errors.txt"), 'w') as stream:
             for error in errors:
                 stream.write(error[0] + "\n")
                 stream.write(error[1] + "\n")
                 stream.write(error[2] + "\n\n")
+                error[2] = error[2].split("\n")
+                results["errors"].append(error)
 
     with open("new-test-suite-success.txt", "w") as stream:
         stream.write(str.join("\n", successes))
+    results["success"] = successes
 
-    input("Press enter to continue")
+    num_errors = len(results["errors"])
+    num_success = len(results["success"])
+    num_total = num_errors + num_success
+
+    print(f"Errors: {num_errors}/{num_total}")
+    print(f"Success: {num_success}/{num_total}")
+    # print(results)
+
+    if (num_errors/num_total) > (num_success/num_total):
+        exit(1)
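The final gate added in both scripts (here with a live `exit(1)`, in the mystery runner still a commented-out `pass`) reduces to a simple majority check, since both ratios share the same denominator. A sketch under a hypothetical name, returning the exit code instead of calling `exit` so it can be tested:

```python
def run_exit_code(num_errors, num_success):
    # Mirrors the diff's gate: fail (exit 1) when the error share exceeds
    # the success share. With a shared denominator this is equivalent to
    # num_errors > num_success. run_exit_code is a hypothetical name; the
    # num_total guard is added here to avoid dividing by zero.
    num_total = num_errors + num_success
    if num_total and (num_errors / num_total) > (num_success / num_total):
        return 1
    return 0

print(run_exit_code(1, 9))  # 0
print(run_exit_code(7, 3))  # 1
```

Note that the version in the diff divides by `num_total` unconditionally, so a run that produced no results at all would raise `ZeroDivisionError` rather than exit cleanly.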