feat: create localized python strings scripts and utility functions
parent 12ae04b7ec
commit 7d8ba9320e
@ -0,0 +1,4 @@
__pycache__
/localization/analysis
/localization/output
/localization/input
@ -1,36 +1,80 @@
**These tools can be used to keep the app's locales in sync across languages and with the Android translations.**

# Tools

## Using the Python scripts

The Python scripts are located in the `tools` directory. To run a script, use the following command:

## Step 1: Find unused key locales in EN

```bash
python3 ./tools/<script>.py
```
`tools/unusedLocalizedString.py` iterates over all root keys in `_locales/en/messages.json` and tries to find each one in the code with a regex. If a key is not found, it prints a line containing False.

Some key exceptions are hardcoded so they are not reported as false negatives.

Most of these scripts can take arguments. To see the arguments for a script, use the following command:

```bash
python3 ./tools/<script>.py --help
```

So just run:

`tools/unusedLocalizedString.py | grep False`

and double check, by searching in the app, that you can effectively remove those keys.
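As a rough illustration (this is not the actual `unusedLocalizedString.py`, whose regex and hardcoded exceptions are more involved), the unused-key check boils down to:

```python
import json
import re
from pathlib import Path


def find_unused_keys(messages_path="_locales/en/messages.json", src_root="ts"):
    """Return root keys from the EN locale that never appear quoted in the sources."""
    with open(messages_path, encoding="utf-8") as f:
        keys = list(json.load(f).keys())
    # Read every .ts/.tsx source once up front
    sources = [p.read_text(encoding="utf-8") for p in Path(src_root).rglob("*.ts*")]
    unused = []
    for key in keys:
        # A key counts as used if it appears in single or double quotes anywhere
        pattern = re.compile(r"['\"]{}['\"]".format(re.escape(key)))
        if not any(pattern.search(src) for src in sources):
            unused.append(key)
    return unused
```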
## Utility

### Sort JSON

## Step 2: Sync keys between each locale on desktop

[./util/sortJson.py](./util/sortJson.py) sorts a given JSON file.

This step removes every key, in all locales, that is not found in the EN locale.

So if, for example, you have a key in `it` which is not present in `en`, it will be removed and the `it` file will be written without it.

```bash
python3 ./tools/util/sortJson.py <file>
```

A summary for each language file is printed on the screen to let you know if anything was changed during the process.
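The removal step amounts to a dictionary filter (a minimal sketch; the real script also rewrites the locale files and prints a summary):

```python
def sync_locale_with_en(en: dict, other: dict) -> dict:
    """Return `other` with every key not present in the EN locale removed."""
    return {key: value for key, value in other.items() if key in en}
```

For example, a stray key in `it` that no longer exists in `en` is dropped, while shared keys keep their translated values.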
## Localization

`python3 tools/compareLocalizedStrings.py`

There are several scripts that handle localization at different stages.

### Find String

## Step 3: Map translations from android to desktop

[findString.py](./findString.py) is a utility script that searches for a given token across the codebase. This script searches in the following directories:

This step matches translations from android to desktop. It needs to be run for each locale you want to update.
- `./ts/`

```bash
python3 ./tools/findString.py <token>
```
`python3 tools/mapAndroidTranslationsToDesktop.py fr <path_to_android_root_project>`

The script can automatically open the files in VSCode by passing the `--open` flag.

Under the hood, it uses an item from the EN desktop locale called `androidKey` to find the matching translation for each locale.

```bash
python3 ./tools/findString.py <token> --open
```

Note that if a desktop key does not have an `androidKey` set, it will just be skipped.

The goal is to have an `androidKey` for each item, if possible. But for now the apps are too different for that to make sense.

**Warning:** The `--open` flag will open only the first result for the token in VSCode. If you wish to open more files, you can pass the `--limit` flag with the maximum number of files you wish to open. You can also pass `--limit 0` to open all files containing the token.

```bash
python3 ./tools/findString.py <token> --open --limit 5
```
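The android-to-desktop mapping can be sketched as follows (a simplified illustration, not the actual `mapAndroidTranslationsToDesktop.py`; the real file formats differ, but the `androidKey` lookup works like this):

```python
def map_android_to_desktop(desktop_en, android_strings, desktop_locale):
    """Copy Android translations into a desktop locale via each EN entry's androidKey."""
    updated = dict(desktop_locale)
    for key, entry in desktop_en.items():
        android_key = entry.get("androidKey")
        if android_key is None:
            continue  # keys without an androidKey are skipped
        if android_key in android_strings:
            updated[key] = android_strings[android_key]
    return updated
```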
### [CrowdIn Post-Import](./localization/crowdInPostImport.sh)

When a CrowdIn PR is made to update the localizations, [./localization/crowdInPostImport.sh](./localization/crowdInPostImport.sh) processes the imported files by running the following scripts:

- [./util/sortJson.py](./util/sortJson.py) - This script sorts a JSON file and is run for all `messages.json` files located in `./_locales/`.
- [./localization/generateLocales.py](./localization/generateLocales.py) - This script generates the TypeScript type definitions [locales.ts](../ts/localization/locales.ts). It also validates the dynamic variables in each locale file and flags any errors.

The generated type file is not committed to the repository and is generated at build time. It is generated here to ensure that changes to any type definitions are not problematic.
## [Generate Localized Strings Analysis](./localization/generateLocalizedStringsAnalysis.sh)

This script generates a report of the localized strings, identifying missing and unused strings, as well as strings that are used but not known about. Without any input files this script outputs:

- `found_strings.csv` - A list of all strings found in the codebase.
- `not_found_strings.txt` - A list of all strings not found in the codebase.
- `potential_matches.csv` - A list of all not-found strings that have a potential match in the codebase, based on a fuzzy search.

The script can be run with:

```bash
python3 ./tools/localization/generateLocalizedStringsAnalysis.py
```

The script can also take the following arguments:

- `--output-dir` - The directory to output the files to. Default is `./tools/localization/analysis/`.
- `--master-strings` - A file containing a master list of strings to compare against. This list specifies the set of known strings. When this is provided, a `not_in_master_list.csv` file is generated, containing all strings found in the codebase that are not in the master list.
- `--to-be-removed` - A file containing a list of strings that are to be removed from the codebase, and so won't be flagged as missing from the master list. Any strings in this list will not appear in the `not_in_master_list.csv` file.
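The master-list comparison amounts to set arithmetic (a sketch; the real script also records the phrase and code locations for each string):

```python
def strings_not_in_master(found_strings, master_list, to_be_removed=()):
    """Found strings that are neither known (master list) nor already slated for removal."""
    return sorted(set(found_strings) - set(master_list) - set(to_be_removed))
```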
@ -0,0 +1,23 @@
#!/bin/sh

echo 'Cleaning up CrowdIn import'

SORT_JSON_FILE=$PWD/tools/util/sortJson.py
GENERATE_LOCALES_FILE=$PWD/tools/localization/generateLocales.py

# Sort all the messages.json files
for dir in $PWD/_locales/*/
do
  dir=${dir%*/}
  file="${dir}/messages.json"
  if [ -f "$file" ]
  then
    python3 $SORT_JSON_FILE "$file"
  else
    echo "$file not found."
  fi
done

# Generate types and find problems. If the Python script exits with a non-zero exit code, the build will fail.
python3 $GENERATE_LOCALES_FILE --print-problems --error-on-problems --error-old-dynamic-variables --print-old-dynamic-variables
@ -0,0 +1,215 @@
import json
import os
import re
import sys

# This allows for importing from the localization and util directories.
# NOTE: Auto-importing tools will also prepend the import paths with "tools.";
# this will not work and needs to be removed from import paths.
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..")))
from util.listUtils import missingFromList
from util.logger import console

def extractDynamicVariables(input_string):
    """
    Extracts dynamic variables from the input string.

    Args:
        input_string (str): The string to extract dynamic variables from.

    Returns:
        list: A list of dynamic variables found in the input string.
    """
    pattern = r"\{(\w+)\}"
    matches = re.findall(pattern, input_string)
    console.debug(f"matches: {matches}")
    return matches

def extractOldDynamicVariables(input_string):
    """
    Extracts old-style dynamic variables (e.g. $name$) from the input string.

    Args:
        input_string (str): The string to extract old-style dynamic variables from.

    Returns:
        list: A list of old-style dynamic variables found in the input string.
    """
    pattern = r"\$(\w+)\$"
    matches = re.findall(pattern, input_string)
    return matches

def extractVariablesFromDict(input_dict):
    """
    Reads through a dictionary of key-value pairs and creates two new dictionaries
    where each value is just the list of dynamic variables found in the original value.

    Args:
        input_dict (dict): The dictionary to extract dynamic variables from.

    Returns:
        tuple: Two dictionaries with the same keys as input_dict; the values are
        lists of new-style and old-style dynamic variables respectively.
    """
    output_dict_new = {}
    output_dict_old = {}
    for key, value in input_dict.items():
        console.debug(f"key: {key}, value: {value}")
        output_dict_new[key] = extractDynamicVariables(value)
        output_dict_old[key] = extractOldDynamicVariables(value)
    return output_dict_new, output_dict_old

def identifyLocaleDyanmicVariableDifferences(locales):
    """
    Identifies the differences between each locale's dynamic variables.

    Args:
        locales (dict): A dictionary with keys being a locale name and values being a dictionary of locale strings.

    Returns:
        dict: A dictionary with the same keys as locales, but the values are dictionaries of issues.
    """
    master_locale = locales["en"]
    issues = {}

    for locale_name, locale in locales.items():
        if locale_name == "en":
            continue

        locale_issues = {
            "missing_keys": [],
            "additional_keys": [],
            "missing_variables": {},
            "additional_variables": {},
        }

        for key, value in master_locale.items():
            # If a key is missing from the locale, add it to the missing_keys list
            if key not in locale:
                locale_issues["missing_keys"].append(key)
            else:
                locale_value = locale[key]

                # Find the dynamic variables that are missing from the locale.
                # If there are none this will set the value to an empty list.
                locale_issues["missing_variables"][key] = missingFromList(
                    value, locale_value
                )

                # Find the dynamic variables that are additional to the locale.
                # If there are none this will set the value to an empty list.
                locale_issues["additional_variables"][key] = missingFromList(
                    locale_value, value
                )

        for key in locale:
            if key not in master_locale:
                locale_issues["additional_keys"].append(key)

        # Remove the empty lists from missing_variables and additional_variables
        locale_issues["missing_variables"] = {
            k: v for k, v in locale_issues["missing_variables"].items() if v
        }
        locale_issues["additional_variables"] = {
            k: v for k, v in locale_issues["additional_variables"].items() if v
        }

        # Remove each issue type that is empty
        locale_issues = {k: v for k, v in locale_issues.items() if v}

        # Only add the locale to the issues if there are any issues
        if locale_issues:
            console.debug_json("locale_issues:", locale_issues)
            issues[locale_name] = locale_issues

    return issues

def prettyPrintIssuesTable(issues):
    """
    Pretty-prints a table from the return of identifyLocaleDyanmicVariableDifferences,
    where the rows are locale names and the columns are the issue types.
    Values are the number of occurrences of each issue.

    Args:
        issues (dict): The issues dictionary returned from identifyLocaleDyanmicVariableDifferences.
    """

    PADDING = 10

    # Print the header key
    print(
        f"\n{'-'*5*PADDING:<{PADDING}}\n\n"
        f"+ Keys: Keys present in the master locale but missing in the locale\n"
        f"- Keys: Keys present in the locale but missing in the master locale\n"
        f"- Vars: Dynamic variables present in the master locale but missing in the locale\n"
        f"+ Vars: Dynamic variables present in the locale but missing in the master locale\n"
    )

    # Print the header
    print(
        f"{'Locale':<{PADDING}}{'+ Keys':<{PADDING}}{'- Keys':<{PADDING}}{'- Vars':<{PADDING}}{'+ Vars':<{PADDING}}\n"
        f"{'-'*5*PADDING:<{PADDING}}"
    )

    for locale_name, locale_issues in issues.items():
        if locale_name == "en":
            continue

        missing_keys = len(locale_issues.get("missing_keys", []))
        additional_keys = len(locale_issues.get("additional_keys", []))
        missing_variables = sum(
            len(v) for v in locale_issues.get("missing_variables", {}).values()
        )
        additional_variables = sum(
            len(v) for v in locale_issues.get("additional_variables", {}).values()
        )

        print(
            f"{locale_name:<{PADDING}}{missing_keys:<{PADDING}}{additional_keys:<{PADDING}}{missing_variables:<{PADDING}}{additional_variables:<{PADDING}}"
        )

def identifyAndPrintOldDynamicVariables(
    localeWithOldVariables, printOldVariables=False
):
    """
    Prints the keys that contain old-style dynamic variables for each locale.

    Args:
        localeWithOldVariables (dict): A dictionary with keys being a locale name and values being a dictionary of locale strings.

    Returns:
        bool: True if any old dynamic variables were found.
    """
    found_problems = False
    for locale_name, locale in localeWithOldVariables.items():
        invalid_strings = dict()
        for key, value in locale.items():
            if value:
                invalid_strings[key] = value
        if invalid_strings:
            console.warn(
                f"{json.dumps(invalid_strings, indent=2, sort_keys=True) if printOldVariables else ''}"
                f"\nLocale {locale_name} contains {len(invalid_strings)} strings with old dynamic variables. (see above)"
            )
            found_problems = True
    return found_problems
@ -0,0 +1,73 @@
#!/bin/python3
import argparse
import os
import sys

# This allows for importing from the localization and util directories.
# NOTE: Auto-importing tools will also prepend the import paths with "tools.";
# this will not work and needs to be removed from import paths.
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..")))
from localization.regex import localization_regex


# Create the parser
parser = argparse.ArgumentParser(
    description="Search the codebase and find a localized string."
)

# Add the arguments
parser.add_argument("Token", metavar="token", type=str, help="the token to search for")
parser.add_argument(
    "-o", "--open", action="store_true", help="Open the results in VSCode"
)
parser.add_argument(
    "-l",
    "--limit",
    type=int,
    default=1,
    help="Specify a maximum number of files to open",
)

# Parse the arguments
args = parser.parse_args()

TOKEN = args.Token
EXCLUDE_FILES = ["LocalizerKeys.ts"]
OPEN_IN_VSCODE = args.open
NUMBER_OF_FILES_LIMIT = args.limit


def find_token_uses(token, root_dir="./ts/", exclude_files=EXCLUDE_FILES):
    regex = localization_regex(token)
    matches = []

    for root, dirs, files in os.walk(root_dir):
        for file in files:
            if file.endswith((".tsx", ".ts")) and file not in exclude_files:
                file_path = os.path.join(root, file)
                with open(file_path, "r") as f:
                    for line_no, line in enumerate(f, start=1):
                        if regex.search(line):
                            matches.append(f"{file_path}:{line_no}")

    return matches

matches = find_token_uses(TOKEN)
if matches:
    print(f"Found {len(matches)} matches for token '{TOKEN}':")
    for match in matches:
        print(match)
else:
    print(f"No matches found for token '{TOKEN}'")

if OPEN_IN_VSCODE:
    if NUMBER_OF_FILES_LIMIT > 0:
        if len(matches) > NUMBER_OF_FILES_LIMIT:
            print(
                f"Opening the first {NUMBER_OF_FILES_LIMIT} files (out of {len(matches)}). "
                f"Use the -l flag to increase the limit, or -l 0 to open all files."
            )
        matches = matches[:NUMBER_OF_FILES_LIMIT]

    for match in matches:
        os.system(f"code -g {match}")
@ -0,0 +1,128 @@
#!/bin/python3
import argparse
import json
import os
import sys

# This allows for importing from the localization and util directories.
# NOTE: Auto-importing tools will also prepend the import paths with "tools.";
# this will not work and needs to be removed from import paths.
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..")))

from util.time import ExecutionTimer

timer = ExecutionTimer()

from dynamicVariables import (
    extractVariablesFromDict,
    identifyLocaleDyanmicVariableDifferences,
    prettyPrintIssuesTable,
    identifyAndPrintOldDynamicVariables,
)
from localization.localeTypes import generateLocalesType
from util.logger import console
from util.fileUtils import createMappedJsonFileDictionary, writeFile

# If the --error-on-problems flag is passed, the script will exit with an error if there are any missing keys or dynamic variables.
# This is useful for CI/CD pipelines to ensure that all translations are consistent.
parser = argparse.ArgumentParser(description="Generate locale files")
parser.add_argument(
    "--error-on-problems",
    action="store_true",
    help="Exit with an error if there are any missing keys or dynamic variables",
)
parser.add_argument(
    "--error-old-dynamic-variables",
    action="store_true",
    help="Exit with an error if there are any old dynamic variables",
)
parser.add_argument(
    "--print-problems",
    action="store_true",
    help="Print the problems table",
)
parser.add_argument(
    "--write-problems", action="store_true", help="Write the problems to a file"
)
parser.add_argument(
    "--problems-file",
    default="./tools/localization/output/problems.json",
    help="The file to write the problems to",
)
parser.add_argument(
    "--print-old-dynamic-variables",
    action="store_true",
    help="Print the old-style dynamic variables that were found",
)
parser.add_argument("--debug", action="store_true", help="Enable debug mode")

args = parser.parse_args()

if args.debug:
    console.enableDebug()

EN_FILE = "./_locales/en/messages.json"
OUTPUT_DIR = "./ts/localization"
INPUT_DIR = "./_locales"

# Create a dictionary that maps locale names to their corresponding JSON file data
locales, localeFiles = createMappedJsonFileDictionary(INPUT_DIR, "messages.json")

# Generate the locales type and write it to a file
generateTypesOutputMessage = generateLocalesType(locales["en"])
console.info(generateTypesOutputMessage)

localeVariables = dict()
localeVariablesOld = dict()

# Extract the dynamic variables from each locale and store them in a dictionary
for locale, data in locales.items():
    console.debug(f"Extracting dynamic variables for {locale}")
    (
        localeVariables[locale],
        localeVariablesOld[locale],
    ) = extractVariablesFromDict(data)

problems = identifyLocaleDyanmicVariableDifferences(localeVariables)

found_old_dynamic_variables = identifyAndPrintOldDynamicVariables(
    localeVariablesOld, args.print_old_dynamic_variables
)

# Wrapping up the script and printing out the results

console.info(generateTypesOutputMessage)

if problems:
    message = "There are issues with the locales."
    if args.print_problems:
        prettyPrintIssuesTable(problems)
        message += " See above for details."

    if args.write_problems:
        writeFile(args.problems_file, json.dumps(problems, indent=2))
        console.info(f"Problems written to {args.problems_file}")
        message += f" Problems written to {args.problems_file}"

    if not args.print_problems and not args.write_problems:
        message += " Run the script with --print-problems or --write-problems to see the problems."

    console.warn(message)

if found_old_dynamic_variables:
    if args.print_old_dynamic_variables:
        details = "See above for details{}".format(
            " (before the problems table)." if args.print_problems else "."
        )
    else:
        details = "Run the script with --print-old-dynamic-variables to see the old dynamic variables."
    console.warn(
        f"Old dynamic variables were found in the locales. "
        f"Please update the locales to use the new dynamic variables. {details}"
    )

console.debug("Locales generation complete")

timer.stop()

if (args.error_on_problems and problems) or (
    args.error_old_dynamic_variables and found_old_dynamic_variables
):
    sys.exit(1)
@ -0,0 +1,321 @@
#!/bin/python3
import os
import sys
import csv
import re
import glob
import argparse
import json

# This allows for importing from the localization and util directories.
# NOTE: Auto-importing tools will also prepend the import paths with "tools.";
# this will not work and needs to be removed from import paths.
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..")))

from util.time import ExecutionTimer

timer = ExecutionTimer()

from localization.regex import localization_regex
from util.listUtils import missingFromSet, removeFromSet
from util.fileUtils import makeDirIfNotExists, removeFileIfExists
from util.logger import console


parser = argparse.ArgumentParser()
parser.add_argument(
    "--debug", action="store_true", help="Enable debug mode, print debug messages"
)
parser.add_argument(
    "--output-dir",
    type=str,
    default="./tools/localization/analysis",
    help="Output directory for the results",
)
parser.add_argument(
    "--master-strings",
    type=str,
    default="./tools/localization/input/master_string_list.txt",
    help="Path to the master string list",
)
parser.add_argument(
    "--to-be-removed",
    type=str,
    default="./tools/localization/input/to_be_removed_list.txt",
    help="Path to the list of strings to be removed",
)

args = parser.parse_args()

# Configuration
intentionallyUnusedStrings = []
DEBUG = args.debug

if DEBUG:
    console.enableDebug()

OUTPUT_DIR = args.output_dir
FOUND_STRINGS_PATH = os.path.join(OUTPUT_DIR, "found_strings.csv")
NOT_FOUND_STRINGS_PATH = os.path.join(OUTPUT_DIR, "not_found_strings.txt")
POTENTIAL_MATCHES_PATH = os.path.join(OUTPUT_DIR, "potential_matches.csv")
NOT_IN_MASTER_LIST_PATH = os.path.join(OUTPUT_DIR, "not_in_master_list.csv")

EN_PATH = "_locales/en/messages.json"

MASTER_STRINGS_PATH = args.master_strings
TO_BE_REMOVED_PATH = args.to_be_removed

# Remove files that are to be generated if they exist
removeFileIfExists(FOUND_STRINGS_PATH)
removeFileIfExists(NOT_FOUND_STRINGS_PATH)
removeFileIfExists(POTENTIAL_MATCHES_PATH)
removeFileIfExists(NOT_IN_MASTER_LIST_PATH)

def flush():
    if not DEBUG:
        sys.stdout.flush()


# File search setup
console.info("Scanning for localized strings...")
files = []
files_to_ignore = ["LocalizerKeys.ts"]
ignore_patterns = [re.compile(pattern) for pattern in files_to_ignore]

console.debug("Ignoring files: {}".format(", ".join(files_to_ignore)))


def should_ignore_file(file_path):
    return any(pattern.search(file_path) for pattern in ignore_patterns)


for extension in ("*.ts", "*.tsx"):
    files.extend(
        [
            y
            for x in os.walk("./ts/")
            for y in glob.glob(os.path.join(x[0], extension))
            if not should_ignore_file(y)
        ]
    )

foundStringsAndLocations = {}  # Dictionary to store found strings and their locations
notFoundStrings = set()  # Set to store not found strings
total_files = len(files) * 1.1
bar_length = 25


def progress_bar(current, total, overallCurrent, overallTotal):
    if DEBUG:
        return
    percent = 100.0 * current / total
    percentOverall = 100.0 * overallCurrent / overallTotal
    sys.stdout.write("\r")
    sys.stdout.write(
        "Overall: [{:{}}] {:>3}% ".format(
            "=" * int(percentOverall / (100.0 / bar_length)),
            bar_length,
            int(percentOverall),
        )
    )
    sys.stdout.write(
        "Stage: [{:{}}] {:>3}%".format(
            "=" * int(percent / (100.0 / bar_length)), bar_length, int(percent)
        )
    )
    sys.stdout.flush()

current_line_number = 0
current_file_number = 0
line_count = 0
keys = []


with open(EN_PATH, "r", encoding="utf-8") as messages_file:
    messages_dict = json.load(messages_file)

# Read the json file again and collect all keys in file order
with open(EN_PATH, "r", encoding="utf-8") as messages_file:
    for line in messages_file:
        for match in re.finditer(r'"([^"]+)":', line):
            keys.append(match.group(1))

total_line_numbers = len(keys)
console.debug(f"Total keys: {total_line_numbers}")


def format_vscode_path(file_path):
    return file_path.replace("./", "")

# search
for key in keys:
    if key in intentionallyUnusedStrings:
        continue

    searchedLine = localization_regex(key)

    locations = []
    current_file_number = 0  # To keep track of the current file number for the progress bar
    for file_path in files:
        with open(file_path, "r", encoding="utf-8") as file_content:
            content = file_content.read()
            for line_number, line in enumerate(content.split("\n"), start=1):
                if searchedLine.search(line):
                    locations.append(f"{format_vscode_path(file_path)}:{line_number}")

        current_file_number += 1
        progress_bar(
            current_file_number, total_files, current_line_number, total_line_numbers
        )
    current_line_number += 1
    if locations:
        console.debug(f"{key} - Found in {len(locations)} locations")
        foundStringsAndLocations[key] = locations
    else:
        console.debug(f"{key} - Not Found")
        notFoundStrings.add(key)

progress_bar(1, 1, 1, 1)

flush()

# Writing found strings and their locations to a CSV file
makeDirIfNotExists(FOUND_STRINGS_PATH)
with open(FOUND_STRINGS_PATH, "w", encoding="utf-8", newline="") as csvfile:
    csvwriter = csv.writer(csvfile)
    csvwriter.writerow(["String", "Phrase", "Locations"])  # Header row
    for foundString, locations in foundStringsAndLocations.items():
        # Write each found string and its locations. Locations are joined into a single string for CSV simplicity.
        csvwriter.writerow(
            [foundString, messages_dict[foundString], "; ".join(locations)]
        )

# Writing not found strings to a text file
makeDirIfNotExists(NOT_FOUND_STRINGS_PATH)
with open(NOT_FOUND_STRINGS_PATH, "w", encoding="utf-8") as not_found_file:
    for notFound in notFoundStrings:
        not_found_file.write(f"{notFound}\n")

sys.stdout.write("\n")
# Print the result statistics and file paths (linkable)
console.info(f"Found {len(foundStringsAndLocations)} strings in {len(files)} files")
console.info(f"Found strings and their locations written to: {FOUND_STRINGS_PATH}")

console.info(
    f"Identified {len(notFoundStrings)} not found strings and written to: {NOT_FOUND_STRINGS_PATH}"
)

# Search for not found strings in any single quotes across all files
console.info("Searching for potential matches for not found strings...")
current_not_found_number = 0
current_file_number = 0
total_not_found_strings = len(notFoundStrings)
potentialMatches = {}  # Dictionary to store potential matches: {string: [file1, file2, ...]}
for string in notFoundStrings:
    console.debug(f"Searching for: {string}")
    current_file_number = 0
    quotedStringPattern = re.compile(
        r"'{}'".format(re.escape(string))
    )  # Pattern to search for 'STRING'
    for file_path in files:
        with open(file_path, "r", encoding="utf-8") as file_content:
            if quotedStringPattern.search(file_content.read()):
                console.debug(f"Potential match found: {string} in {file_path}")
                if string not in potentialMatches:
                    potentialMatches[string] = []
                potentialMatches[string].append(file_path)
        current_file_number += 1
        progress_bar(
            current_file_number,
            total_files,
            current_not_found_number,
            total_not_found_strings,
        )
    current_not_found_number += 1

# Function to find the line numbers of matches within a specific file
def find_line_numbers(file_path, pattern):
    line_numbers = []
    with open(file_path, "r", encoding="utf-8") as file:
        for i, line in enumerate(file, start=1):
            if pattern.search(line):
                line_numbers.append(i)
    return line_numbers


# Process the found files to add line numbers
for string, matched_files in potentialMatches.items():
    quotedStringPattern = re.compile(r"'{}'".format(re.escape(string)))
    match_details = []
    for file_path in matched_files:
        line_numbers = find_line_numbers(file_path, quotedStringPattern)
        match_details.extend(f"{file_path}:{line}" for line in line_numbers)
    potentialMatches[string] = match_details  # Update with detailed matches

# Writing potential matches to CSV, now with line numbers
makeDirIfNotExists(POTENTIAL_MATCHES_PATH)
with open(POTENTIAL_MATCHES_PATH, "w", encoding="utf-8", newline="") as csvfile:
    csvwriter = csv.writer(csvfile)
    csvwriter.writerow(["String", "Potential File Matches"])
    for string, matches in potentialMatches.items():
        csvwriter.writerow([string, "; ".join(matches)])

sys.stdout.write("\n")
# Print the result statistics and file paths (linkable)
console.info(
    f"Potential matches found for {len(potentialMatches)}/{len(notFoundStrings)} not found strings"
)
console.info(f"Potential matches written to: {POTENTIAL_MATCHES_PATH}")

# Identify found strings that are not in the master string list
try:
    masterStringList = set()
    with open(MASTER_STRINGS_PATH, "r", encoding="utf-8") as masterListFile:
        for line in masterListFile:
            masterStringList.add(line.strip())

    notInMasterList = missingFromSet(
        set(foundStringsAndLocations.keys()), masterStringList
    )

    try:
        slatedForRemovalList = set()
        with open(TO_BE_REMOVED_PATH, "r", encoding="utf-8") as slatedForRemovalFile:
            for line in slatedForRemovalFile:
                slatedForRemovalList.add(line.strip())
        notInMasterList = removeFromSet(notInMasterList, slatedForRemovalList)
    except FileNotFoundError:
        console.warn(
            f"Strings to be removed list not found at: {TO_BE_REMOVED_PATH}. Skipping comparison."
        )

    # Output the found strings not in the master list to a CSV file
    makeDirIfNotExists(NOT_IN_MASTER_LIST_PATH)
    with open(NOT_IN_MASTER_LIST_PATH, "w", encoding="utf-8", newline="") as csvfile:
        csvwriter = csv.writer(csvfile)
        csvwriter.writerow(["String", "Phrase", "Locations"])  # Header row
        for notInMaster in notInMasterList:
            # Write each found string and its locations. Locations are joined into a single string for CSV simplicity.
            csvwriter.writerow(
                [
                    notInMaster,
                    messages_dict[notInMaster],
                    "; ".join(foundStringsAndLocations[notInMaster]),
                ]
            )
    console.info(f"Found {len(notInMasterList)} strings not in the master list")
    console.info(
        f"Found strings not in the master list written to: {NOT_IN_MASTER_LIST_PATH}"
    )
except FileNotFoundError:
    console.warn(
        f"Master string list not found at: {MASTER_STRINGS_PATH}. Skipping comparison."
    )

if DEBUG:
    console.warn(
        "This script ran with debug enabled. Please disable debug mode for a cleaner output and faster execution."
    )
|
||||
)
|
||||
|
||||
timer.stop()
|
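To illustrate how `find_line_numbers` turns a file match into `path:line` locations, here is a self-contained sketch (the helper is copied inline, and the sample file content is hypothetical):

```python
import os
import re
import tempfile


def find_line_numbers(file_path, pattern):
    # Inline copy of the helper above: collect 1-based line numbers that match
    line_numbers = []
    with open(file_path, "r", encoding="utf-8") as file:
        for i, line in enumerate(file, start=1):
            if pattern.search(line):
                line_numbers.append(i)
    return line_numbers


with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "sample.tsx")
    with open(path, "w", encoding="utf-8") as f:
        f.write("const a = 1;\nconst s = 'deleteMessages';\n// 'deleteMessages'\n")
    matches = find_line_numbers(path, re.compile(r"'deleteMessages'"))

print(matches)  # → [2, 3]
```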
@ -0,0 +1,72 @@
#!/bin/python3
import re

OUTPUT_FILE = "./ts/localization/locales.ts"


def wrapValue(value):
    """
    Wraps the given value in single quotes if it contains any characters other than letters, digits, or underscores.

    Args:
        value (str): The value to be wrapped.

    Returns:
        str: The wrapped value if it contains any special characters, otherwise the original value.
    """
    if re.search(r"[^a-zA-Z0-9_]", value):
        return f"'{value}'"
    return value


def parseValue(value):
    """
    Parses the given value by replacing single quotes with escaped single quotes.

    Args:
        value (str): The value to be parsed.

    Returns:
        str: The parsed value with escaped single quotes.
    """
    return value.replace("'", "\\'")


def generate_js_object(data):
    """
    Generate a JavaScript object from a dictionary.

    Args:
        data (dict): The dictionary containing key-value pairs.

    Returns:
        str: A string representation of the JavaScript object.
    """
    js_object = "{\n"
    for key, value in data.items():
        js_object += f"  {wrapValue(key)}: '{parseValue(value)}',\n"
    js_object += "}"
    return js_object


DISCLAIMER = """
// This file was generated by a script. Do not modify this file manually.
// To make changes, modify the corresponding JSON file and re-run the script.

"""


def generateLocalesType(locale):
    """
    Generate the locales type and write it to a file.

    Args:
        locale: The locale dictionary containing the localization data.

    Returns:
        str: A confirmation message with the output file path.
    """
    # write the locale_dict to a file
    with open(OUTPUT_FILE, "w") as ts_file:
        ts_file.write(
            f"{DISCLAIMER}"
            f"const dict = {generate_js_object(locale)} as const;\nexport type Dictionary = typeof dict;"
        )
    return f"Locales generated at: {OUTPUT_FILE}"
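To make the quoting and escaping rules concrete, the sketch below runs the generator on a small sample dictionary (the three helpers are copied inline so it runs standalone; the keys and phrases are hypothetical):

```python
import re


def wrapValue(value):
    # Quote keys that are not valid bare JS identifiers (inline copy from the script)
    if re.search(r"[^a-zA-Z0-9_]", value):
        return f"'{value}'"
    return value


def parseValue(value):
    # Escape single quotes so the value is safe inside a single-quoted JS string
    return value.replace("'", "\\'")


def generate_js_object(data):
    # Build the object literal line by line (inline copy from the script)
    js_object = "{\n"
    for key, value in data.items():
        js_object += f"  {wrapValue(key)}: '{parseValue(value)}',\n"
    js_object += "}"
    return js_object


result = generate_js_object({"okay": "Okay", "yes-no": "Don't ask"})
print(result)
# → {
#     okay: 'Okay',
#     'yes-no': 'Don\'t ask',
#   }
```

Note that the hyphenated key is wrapped in quotes and the apostrophe is escaped, so the emitted `locales.ts` stays syntactically valid TypeScript.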
@ -0,0 +1,16 @@
import re


def localization_regex(string):
    e_str = re.escape(string)

    rex_b = r"i18n\([\r\n]?\s*'{}'|messages.{}|'{}'".format(e_str, e_str, e_str)
    rex_l = r"localizedKey\s*=\s*'{}'".format(e_str)
    res_8n = r"window\.i18n\(\s*'{}'(?:,\s*(?:[^\)]+?))?\s*\)".format(e_str)
    res_comp = r'<I18n\s+[^>]*?token=["\']{}["\'][^>]*?>'.format(e_str)
    res_token = r'token=["\']{}["\']'.format(e_str)

    return re.compile(
        f"{rex_b}|{rex_l}|{res_8n}|{res_comp}|{res_token}",
        re.DOTALL,
    )
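A quick check of which call styles the combined pattern catches (the function is copied inline so the example is self-contained; the key name `deleteMessages` is only an illustrative sample):

```python
import re


def localization_regex(string):
    # Inline copy of the helper above
    e_str = re.escape(string)
    rex_b = r"i18n\([\r\n]?\s*'{}'|messages.{}|'{}'".format(e_str, e_str, e_str)
    rex_l = r"localizedKey\s*=\s*'{}'".format(e_str)
    res_8n = r"window\.i18n\(\s*'{}'(?:,\s*(?:[^\)]+?))?\s*\)".format(e_str)
    res_comp = r'<I18n\s+[^>]*?token=["\']{}["\'][^>]*?>'.format(e_str)
    res_token = r'token=["\']{}["\']'.format(e_str)
    return re.compile(
        f"{rex_b}|{rex_l}|{res_8n}|{res_comp}|{res_token}",
        re.DOTALL,
    )


pattern = localization_regex("deleteMessages")

# Each usage style in the codebase is covered by one alternative of the pattern:
print(bool(pattern.search("window.i18n('deleteMessages', [count])")))  # True
print(bool(pattern.search("<I18n token='deleteMessages' />")))         # True
print(bool(pattern.search("window.i18n('someOtherKey')")))             # False
```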
@ -1,55 +0,0 @@
#!/bin/python

# usage : ./tools/unusedLocalizedString.py |grep False

import re
import os
from glob import glob

# get all files matching .js, .ts and .tsx in ./
dir_path = './'
files = [y for x in os.walk(dir_path) for y in glob(os.path.join(x[0], '*.js'))]
files += [y for x in os.walk(dir_path) for y in glob(os.path.join(x[0], '*.ts'))]
files += [y for x in os.walk(dir_path) for y in glob(os.path.join(x[0], '*.tsx'))]

# exclude node_modules directories
filtered_files = [f for f in files if "node_modules" not in f]

# search for this pattern in _locales/en/messages.json: it is a defined localized string
patternLocalizedString = re.compile("^ \".*\"\: {")

localizedStringToSearch = 0
localizedStringNotFound = 0
for i, line in enumerate(open('_locales/en/messages.json')):
    for match in re.finditer(patternLocalizedString, line):
        localizedStringToSearch = localizedStringToSearch + 1
        found = match.group()
        # extract the key only from the line
        foundAline = found[3:-4]
        # print 'Found on line %s: \'%s\'' % (i + 1, foundAline)

        # generate a new regex to be searched for to find its usage in the code
        # currently, it matches
        # * i18n('key') with or without line return
        # * messages.key (used in some places)
        # * and also 'key'. (some false positive might be present here)
        searchedLine = "i18n\([\r\n]?\s*'{0}'|messages.{0}|'{0}'".format(foundAline)

        found = False
        # skip timerOptions string constructed dynamically
        if 'timerOption_' in foundAline:
            found = True
        else:
            for file_path in filtered_files:
                fileContent = open(file_path, 'r').read()
                if len(re.findall(searchedLine, fileContent, re.MULTILINE)) > 0:
                    found = True
                    break
        if not found:
            localizedStringNotFound = localizedStringNotFound + 1
        print "i18n for '{0}': found:{1}:".format(foundAline, found)

print "number of localized string found in messages.json:{0}".format(localizedStringToSearch)
print "number of localized string NOT found:{0}".format(localizedStringNotFound)
@ -1,31 +0,0 @@
#!/bin/python3

import re
from os import path, listdir
from glob import glob
import json
import sys
from collections import OrderedDict

LOCALES_FOLDER = './_locales'

EN_FILE = LOCALES_FOLDER + '/en/messages.json'

LOCALIZED_KEYS_FILE = './ts/types/LocalizerKeys.ts'

stringToWrite = "export type LocalizerKeys =\n | "

with open(EN_FILE, 'r') as jsonFile:
    data = json.loads(jsonFile.read(), object_pairs_hook=OrderedDict)
    keys = sorted(list(data.keys()))
    stringToWrite += json.dumps(keys, sort_keys=True).replace(',', '\n |').replace('"', '\'')[1:-1]

stringToWrite += ';\n'
# print(stringToWrite)
with open(LOCALIZED_KEYS_FILE, "w") as typeFile:
    typeFile.write(stringToWrite)

print('Updated LocalizerKeys.ts')
@ -0,0 +1,80 @@
import json
import os


def createMappedJsonFileDictionary(inputDir, fileName):
    """
    This function creates a dictionary that maps sub-directory names to their corresponding JSON file data.

    Args:
        inputDir (str): The path to the input directory containing sub-directories.
        fileName (str): The name of the JSON file to be read for each sub-directory.

    Returns:
        tuple: A tuple containing two dictionaries:
            - The first dictionary maps sub-directory names (with hyphens replaced by underscores) to their JSON data.
            - The second dictionary maps sub-directory names (with hyphens replaced by underscores) to the file paths of their JSON files.
    """

    # Get a list of all directories in the input directory
    files = [
        name
        for name in os.listdir(inputDir)
        if os.path.isdir(os.path.join(inputDir, name))
    ]

    # Initialize dictionaries to hold JSON data and file paths
    dictionary = dict()
    dictionaryKeyFiles = dict()

    # Iterate over each sub-directory
    for filePath in files:
        # Replace hyphens in the directory name with underscores to create the dictionary key
        key = filePath.replace("-", "_")

        # Construct the full path to the JSON file in this sub-directory
        filePath = os.path.join(inputDir, filePath, fileName)

        # Store the file path in the dictionaryKeyFiles dictionary
        dictionaryKeyFiles[key] = filePath

        # Open the JSON file and load the data into the dictionary
        with open(filePath, "r") as jsonFile:
            dictionary[key] = json.load(jsonFile)

    # Return the dictionaries containing the JSON data and file paths
    return dictionary, dictionaryKeyFiles


def makeDirIfNotExists(filePath):
    """
    This function creates a directory if it does not already exist.

    Args:
        filePath (str): The path to the file whose parent directory should be created.
    """
    os.makedirs(os.path.dirname(filePath), exist_ok=True)


def writeFile(filePath, data):
    """
    This function writes data to a file, creating its parent directories if they do not exist.

    Args:
        filePath (str): The path to the file to write the data to.
        data (str): The data to write to the file.
    """
    makeDirIfNotExists(filePath)
    with open(filePath, "w") as file:
        file.write(data)


def removeFileIfExists(filePath):
    """
    This function removes a file if it exists.

    Args:
        filePath (str): The path to the file to remove.
    """
    if os.path.exists(filePath) and os.path.isfile(filePath):
        os.remove(filePath)
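The hyphen-to-underscore key mapping is the subtle part of `createMappedJsonFileDictionary`. The sketch below builds a throwaway locale tree in a temp directory and runs an inline copy of the function over it (the locale names and messages are hypothetical sample data):

```python
import json
import os
import tempfile


def createMappedJsonFileDictionary(inputDir, fileName):
    # Inline copy of the helper above for a self-contained demo
    files = [
        name
        for name in os.listdir(inputDir)
        if os.path.isdir(os.path.join(inputDir, name))
    ]
    dictionary = dict()
    dictionaryKeyFiles = dict()
    for filePath in files:
        key = filePath.replace("-", "_")
        filePath = os.path.join(inputDir, filePath, fileName)
        dictionaryKeyFiles[key] = filePath
        with open(filePath, "r") as jsonFile:
            dictionary[key] = json.load(jsonFile)
    return dictionary, dictionaryKeyFiles


# Build <tmp>/en/messages.json and <tmp>/pt-BR/messages.json
with tempfile.TemporaryDirectory() as tmp:
    for locale, messages in {"en": {"okay": "Okay"}, "pt-BR": {"okay": "OK"}}.items():
        os.makedirs(os.path.join(tmp, locale))
        with open(os.path.join(tmp, locale, "messages.json"), "w") as f:
            json.dump(messages, f)

    data, paths = createMappedJsonFileDictionary(tmp, "messages.json")
    # The hyphenated "pt-BR" directory becomes the "pt_BR" dictionary key
    print(sorted(data))  # → ['en', 'pt_BR']
```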
@ -0,0 +1,40 @@
def missingFromList(list1, list2):
    """
    Returns a new list containing the elements that are present in list1 but not in list2.

    Args:
        list1 (list): The first list.
        list2 (list): The second list.

    Returns:
        list: A new list containing the elements that are present in list1 but not in list2.
    """
    return [item for item in set(list1) if item not in set(list2)]


def missingFromSet(set1, set2):
    """
    Returns a new set containing the elements that are present in set1 but not in set2.

    Args:
        set1 (set): The first set.
        set2 (set): The second set.

    Returns:
        set: A new set containing the elements that are present in set1 but not in set2.
    """
    return {item for item in set1 if item not in set2}


def removeFromSet(set1, set2):
    """
    Removes the elements that are present in set2 from set1.

    Args:
        set1 (set): The first set.
        set2 (set): The second set.

    Returns:
        set: A new set containing the elements of set1 after removing the elements of set2.
    """
    return {item for item in set1 if item not in set2}
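These helpers are plain set differences; a short sketch of how the comparison script uses them (the function is copied inline, and the string keys are hypothetical sample data):

```python
def missingFromSet(set1, set2):
    # Inline copy: elements of set1 that set2 lacks
    return {item for item in set1 if item not in set2}


masterStringList = {"okay", "cancel", "deleteMessages"}
foundStrings = {"okay", "deleteMessages", "newDialogTitle"}

# Strings referenced in the code but absent from the master list
notInMaster = missingFromSet(foundStrings, masterStringList)
print(notInMaster)  # → {'newDialogTitle'}
```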
@ -0,0 +1,39 @@
import os
import sys

# This allows for importing from the localization and util directories.
# NOTE: Auto-importing tools will also prepend the import paths with "tools.";
# this will not work and needs to be removed from import paths.
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..")))
from util.print import print_json

global DEBUG
DEBUG = False


class console:

    def enableDebug():
        global DEBUG
        DEBUG = True
        console.debug("Debug mode enabled")

    def log(msg):
        print(msg)

    def debug(msg):
        if DEBUG:
            print(f"[DEBUG] {msg}")

    def info(msg):
        print(f"[INFO] {msg}")

    def warn(msg):
        print(f"[WARN] {msg}")

    def debug_json(msg, json_data):
        if DEBUG:
            print(msg)
            print_json(json_data, sort_keys=True)

    def info_json(msg, json_data):
        print(f"[INFO] {msg}")
        print_json(json_data, sort_keys=False)
@ -0,0 +1,5 @@
import json


def print_json(data, sort_keys=False):
    print(json.dumps(data, sort_keys=sort_keys, indent=2))
@ -0,0 +1,38 @@
#!/bin/python3
import json
import argparse

# Create the parser
parser = argparse.ArgumentParser(description="Sort a JSON file.")

# Add the arguments
parser.add_argument(
    "InputFile", metavar="inputfile", type=str, help="the input JSON file"
)
parser.add_argument(
    "-o",
    metavar="outputfile",
    type=str,
    nargs="?",
    default="",
    help="the output JSON file (optional)",
)

# Parse the arguments
args = parser.parse_args()

INPUT_FILE = args.InputFile
OUTPUT_FILE = args.o if args.o else INPUT_FILE

# Load the JSON data from the input file
with open(INPUT_FILE, "r") as f:
    data = json.load(f)

# Sort the JSON data
sorted_data = json.dumps(data, sort_keys=True, indent=2)

with open(OUTPUT_FILE, "w") as f:
    f.write(sorted_data)

print(f"Sorted JSON data written to {OUTPUT_FILE}")
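The heavy lifting in this script is a single `json.dumps(..., sort_keys=True)` call; note that it sorts every object level, not just the top one (sample data below is hypothetical):

```python
import json

data = {"b": 1, "a": {"d": 2, "c": 3}}

# sort_keys=True recursively orders the keys of every nested object
sorted_text = json.dumps(data, sort_keys=True, indent=2)
print(sorted_text)
```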
@ -0,0 +1,24 @@
import time


class ExecutionTimer:
    def __init__(self):
        self.start_time = None
        self.start()

    def start(self):
        if self.start_time is not None:
            print("Timer is already running. Use .stop() to stop it")
            return

        self.start_time = time.time()

    def stop(self):
        if self.start_time is None:
            print("Timer is not running. Use .start() to start it")
            return

        elapsed_time = time.time() - self.start_time
        self.start_time = None
        formatted_time = "{:.2f}".format(elapsed_time)
        print(f"Elapsed time: {formatted_time} seconds")
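Because the constructor calls `start()`, a script only needs to construct the timer and call a matching `stop()` at the end, which is how the comparison script's trailing `timer.stop()` works. A minimal self-contained usage sketch (class copied inline):

```python
import time


class ExecutionTimer:
    # Inline copy of the timer above for a self-contained demo
    def __init__(self):
        self.start_time = None
        self.start()

    def start(self):
        if self.start_time is not None:
            print("Timer is already running. Use .stop() to stop it")
            return
        self.start_time = time.time()

    def stop(self):
        if self.start_time is None:
            print("Timer is not running. Use .start() to start it")
            return
        elapsed_time = time.time() - self.start_time
        self.start_time = None
        print(f"Elapsed time: {elapsed_time:.2f} seconds")


timer = ExecutionTimer()  # clock starts here
time.sleep(0.01)
timer.stop()  # prints the elapsed seconds and resets the timer
```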