From 1991d081cbbb0807df73ebadd1f9d4d880e1d473 Mon Sep 17 00:00:00 2001
From: Torrin Leonard <82110564+torrinworx@users.noreply.github.com>
Date: Wed, 10 Aug 2022 22:32:35 -0400
Subject: [PATCH 1/7] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 1536408..a2fc350 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,5 @@

- +

Blend_My_NFTs

From 752fa4b6e96447430847145eeeec014de77067a4 Mon Sep 17 00:00:00 2001 From: Torrin Leonard <82110564+torrinworx@users.noreply.github.com> Date: Wed, 10 Aug 2022 22:47:47 -0400 Subject: [PATCH 2/7] Update README.md --- README.md | 112 ++++++++++++++++++++---------------------------------- 1 file changed, 41 insertions(+), 71 deletions(-) diff --git a/README.md b/README.md index a2fc350..e9a7a35 100644 --- a/README.md +++ b/README.md @@ -388,25 +388,30 @@ If you need help creating a JSON file, checkout this tutorial: [How to Create JS To learn more about JSON files and how to structure data read this article: [Working with JSON](https://developer.mozilla.org/en-US/docs/Learn/JavaScript/Objects/JSON) -Material Randomizer compatable materials must follow this naming convention: `__` - ## Material Randomizer JSON Schema If you'd like, copy and paste this template into the JSON file you created above: ``` { - "": { - "Variant Objects":["", ""], - "Material List": ["__", "__"] - }, - "Red Cone_1_0": { - "Variant Objects":["", ""], - "Material List": ["__", "__"] + "": { + "Material List": { + "": , + "": , + "": , + "": + }, + "Variant Objects": [ + "", + "", + "", + "" + ] } } ``` +TODO: Add more detail to this section regarding the new material system, for now reference this commit: https://github.com/torrinworx/Blend_My_NFTs/commit/3cb2a69c81932e5b882b69c4c4bfbdd153a2dfee # Custom Metadata Fields @@ -466,103 +471,68 @@ If you'd like, copy and paste this template into the JSON file you created above ``` { "Rule-1":{ - "Items-1": [ - "" + "IF": [ + "" ], - "Rule-Type": "", - "Items-2":[ - "" - ] - }, - "Rule-2":{ - "Items-1": [ - "" - ], - "Rule-Type": "", - "Items-2":[ - "" + "THEN":[ + "", "" ] } + "Rule-2":{ + "IF": [ + "" + ], + "NOT":[ + "", "" + ] + } } ``` ### Schema Definition - ``Rule-#`` A dictionary representing the information of a single defined Rule of an NFT collection. There can be as many as you choose. Increment the ``#`` when you create a new rule. -- ``Items-1`` A list of strings representing the names of Attribute(s) or Variant(s). -- ``Rule-Type`` The rule that governs the relation between ``Items-1`` and ``Items-2``. Has two possible values: ``Never with`` and ``Only with``. -- ``Items-2`` A list of strings representing the names of Attribute(s) or Variant(s). +- ``IF`` A single String in a list representing the Variant the rule is based off of. +- ``THEN`` A list of Attributes or Variants that always appear IF the ``IF`` Variant is selected. +- ``NOT`` A list of Attributes or Variants that never appear IF the ``IF`` Variant is selected. + +* Note: ``NOT`` is not currently stable, it's recommended to only use a Single variant per NOT rule to see consistant results. + ## Example Logic.json File Say we have the following scene in a .blend file: Screen Shot 2022-03-13 at 4 21 52 PM Note that we have two Attributes, ``Cube`` and ``Sphere``, and that they have 4 Variants. If you'd like to follow along with this example I'd recommend downloading the [Logic_Example.blend](https://github.com/torrinworx/BMNFTs_Examples/blob/main/Logic_Example.blend). 
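Before the worked examples that follow, here is an optional convenience sketch (not part of the add-on) for writing a ``Logic.json`` that follows the IF/THEN/NOT schema described above, instead of editing the file by hand. ``Red Cube_1_25`` and ``Red Sphere_1_25`` are Variants from the example scene; ``Blue Cube_2_25`` is a hypothetical Variant name used purely for illustration, and the output path is up to you.

```
# Minimal sketch: write a Logic.json matching the IF/THEN/NOT schema.
# "Blue Cube_2_25" below is a hypothetical Variant name for illustration only.
import json

logic_rules = {
    "Rule-1": {
        "IF": ["Red Cube_1_25"],
        "THEN": ["Red Sphere_1_25"],
    },
    # NOT is currently unstable, so keep it to a single Variant per rule:
    "Rule-2": {
        "IF": ["Red Sphere_1_25"],
        "NOT": ["Blue Cube_2_25"],
    },
}

with open("Logic.json", "w") as outfile:
    json.dump(logic_rules, outfile, indent=2)
```

Running the script once produces a valid ``Logic.json``; point the add-on's Logic file path at the generated file as usual.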
-### Never With, Logic Rule Examples -- **Never With, Variants example:** - In this example, the Variant ``Red Cube_1_25`` never appears with ``Red Sphere_1_25``: +### Logic Examples +- **IF/THEN Rule setup** + In this example, the Variant ``Red Cube_1_25`` will always appear with ``Red Sphere_1_25``: ``` { "Rule-1":{ - "Items-1": [ + "IF": [ "Red Cube_1_25" ], - "Rule-Type": "Never With", - "Items-2":[ + "THEN":[ "Red Sphere_1_25" ] } } ``` - - -- **Never With, Attributes example:** - In this example, the Attribute ``Cube`` never appears with ``Red Sphere_1_25``. When ``Red Sphere_1_25`` is selected, no Variants in the Cube Attribute are selected, and hence the Attribute is set to "Empty": +- **IF/NOT Rule setup** + In this example, the Variant ``Red Cube_1_25`` will never appear with ``Red Sphere_1_25``: ``` { "Rule-1":{ - "Items-1": [ - "Cube" - ], - "Rule-Type": "Never With", - "Items-2":[ - "Red Sphere_1_25" - ] - } - } - ``` - -### Only With, Logic Rule Examples -- **Only With, Variants example:** - In this example, the Variant ``Red Cube_1_25`` only appears with ``Red Sphere_1_25``: - ``` - { - "Rule-1":{ - "Items-1": [ + "IF": [ "Red Cube_1_25" ], - "Rule-Type": "Only With", - "Items-2":[ + "NOT":[ "Red Sphere_1_25" ] } } ``` -- **Only With, Attributes example:** - In this example, the Attribute ``Cube`` only appears with ``Red Sphere_1_25``: - ``` - { - "Rule-1":{ - "Items-1": [ - "Cube" - ], - "Rule-Type": "Never With", - "Items-2":[ - "Red Sphere_1_25" - ] - } - } - ``` Now that you have a completed Logic.json file, you can now go back and complete [Step 1. Create Data](#step-1---create-nft-data)! From 4f42106fa3245d42c00047981f159d7652eebd77 Mon Sep 17 00:00:00 2001 From: Torrin Leonard <82110564+torrinworx@users.noreply.github.com> Date: Thu, 11 Aug 2022 09:48:01 -0400 Subject: [PATCH 3/7] Fixing merge issues Minor bug fixes to exporter and cleaning up code. 
--- UILists/Logic_UIList.py | 6 +- __init__.py | 484 ++++++++++++++++++++------------- main/Constants.py | 17 +- main/DNA_Generator.py | 533 ++++++++++++++++++------------------- main/Exporter.py | 169 +++++++++--- main/Intermediate.py | 49 ++-- main/Logic.py | 418 +++++++++++++++++------------ main/Material_Generator.py | 72 +++-- main/Rarity.py | 24 +- 9 files changed, 1035 insertions(+), 737 deletions(-) diff --git a/UILists/Logic_UIList.py b/UILists/Logic_UIList.py index 4207348..8bed03f 100644 --- a/UILists/Logic_UIList.py +++ b/UILists/Logic_UIList.py @@ -117,10 +117,8 @@ class CUSTOM_logic_objectCollection(PropertyGroup): name="Rule Type", description="Select the Rule Type", items=[ - ('Never With', "Never With", ""), - ('Only With', "Only With", ""), - ('Always With', "Always With", ""), - + ('THEN', "Then", ""), + ('NOT', "Not", ""), ] ) item_list2: StringProperty(default="Item List 2") diff --git a/__init__.py b/__init__.py index b4c7472..6d4a0e8 100644 --- a/__init__.py +++ b/__init__.py @@ -1,15 +1,18 @@ bl_info = { "name": "Blend_My_NFTs", "author": "Torrin Leonard, This Cozy Studio Inc", - "version": (4, 0, 2), - "blender": (3, 2, 0), + "version": (4, 5, 0), + "blender": (3, 2, 2), "location": "View3D", - "description": "An open source, free to use Blender add-on that enables you to create thousands of unique images, animations, and 3D models.", + "description": "A free and opensource Blender add-on that enables you to create thousands of unique images, animations, and 3D models.", + "support": "COMMUNITY", + "doc_url": "https://github.com/torrinworx/Blend_My_NFTs", + "tracker_url": "https://github.com/torrinworx/Blend_My_NFTs/issues/new", "category": "Development", } -BMNFTS_VERSION = "v4.0.2" -LAST_UPDATED = "8:19AM, May 31st, 2022" +BMNFTS_VERSION = "v4.5.0" +LAST_UPDATED = "8:19AM, Aug 11th, 2022" # ======== Import handling ======== # @@ -18,17 +21,20 @@ from bpy.app.handlers import persistent from bpy.props import (IntProperty, BoolProperty, CollectionProperty) - +# Python modules: import os import sys import json import importlib -from dataclasses import dataclass +import traceback from typing import Any +from dataclasses import dataclass +from datetime import datetime, timezone # "a little hacky bs" - matt159 ;) sys.path.append(os.path.dirname(os.path.realpath(__file__))) +# Local file imports: from main import \ Checks, \ DNA_Generator, \ @@ -74,7 +80,41 @@ if "bpy" in locals(): # Used for updating text and buttons in UI panels combinations: int = 0 recommended_limit: int = 0 +dt = datetime.now(timezone.utc).astimezone() # Date Time in UTC local + +@persistent +def Refresh_UI(dummy1, dummy2): + """ + Refreshes the UI upon user interacting with Blender (using depsgraph_update_post handler). Might be a better handler + to use. 
+ """ + global combinations + global recommended_limit + + combinations = (get_combinations.get_combinations()) + recommended_limit = int(round(combinations / 2)) + + # Add panel classes that require refresh to this refresh_panels tuple: + refresh_panel_classes = ( + BMNFTS_PT_CreateData, + ) + + def redraw_panel(panels): + for i in panels: + try: + bpy.utils.unregister_class(i) + except Exception: + print(traceback.format_exc()) + bpy.utils.register_class(i) + + redraw_panel(refresh_panel_classes) + + +bpy.app.handlers.depsgraph_update_post.append(Refresh_UI) + + +# ======== Defining BMNFTs Data ======== # @dataclass class BMNFTData: nftName: str @@ -115,6 +155,17 @@ class BMNFTData: enableRarity: bool + enableAutoShutdown: bool + + specify_timeBool: bool + hours: int + minutes: int + + emailNotificationBool: bool + sender_from: str + email_password: str + receiver_to: str + custom_Fields: dict = None fail_state: Any = False failed_batch: Any = None @@ -124,82 +175,64 @@ class BMNFTData: def __post_init__(self): self.custom_Fields = {} + def getBMNFTData(): _save_path = bpy.path.abspath(bpy.context.scene.input_tool.save_path) _Blend_My_NFTs_Output, _batch_json_save_path, _nftBatch_save_path = make_directories(_save_path) - - data = BMNFTData ( - nftName = bpy.context.scene.input_tool.nftName, - save_path = _save_path, - nftsPerBatch = bpy.context.scene.input_tool.nftsPerBatch, - batchToGenerate = bpy.context.scene.input_tool.batchToGenerate, - collectionSize = bpy.context.scene.input_tool.collectionSize, - enableRarity = bpy.context.scene.input_tool.enableRarity, + data = BMNFTData( + nftName=bpy.context.scene.input_tool.nftName, + save_path=_save_path, + nftsPerBatch=bpy.context.scene.input_tool.nftsPerBatch, + batchToGenerate=bpy.context.scene.input_tool.batchToGenerate, + collectionSize=bpy.context.scene.input_tool.collectionSize, - Blend_My_NFTs_Output = _Blend_My_NFTs_Output, - batch_json_save_path = _batch_json_save_path, - nftBatch_save_path = _nftBatch_save_path, + enableRarity=bpy.context.scene.input_tool.enableRarity, - enableLogic = bpy.context.scene.input_tool.enableLogic, - enable_Logic_Json = bpy.context.scene.input_tool.enable_Logic_Json, - logicFile = bpy.context.scene.input_tool.logicFile, + Blend_My_NFTs_Output=_Blend_My_NFTs_Output, + batch_json_save_path=_batch_json_save_path, + nftBatch_save_path=_nftBatch_save_path, - enableImages = bpy.context.scene.input_tool.imageBool, - imageFileFormat = bpy.context.scene.input_tool.imageEnum, + enableLogic=bpy.context.scene.input_tool.enableLogic, + enable_Logic_Json=bpy.context.scene.input_tool.enable_Logic_Json, + logicFile=bpy.context.scene.input_tool.logicFile, - enableAnimations = bpy.context.scene.input_tool.animationBool, - animationFileFormat = bpy.context.scene.input_tool.animationEnum, + enableImages=bpy.context.scene.input_tool.imageBool, + imageFileFormat=bpy.context.scene.input_tool.imageEnum, - enableModelsBlender = bpy.context.scene.input_tool.modelBool, - modelFileFormat = bpy.context.scene.input_tool.modelEnum, + enableAnimations=bpy.context.scene.input_tool.animationBool, + animationFileFormat=bpy.context.scene.input_tool.animationEnum, - enableCustomFields = bpy.context.scene.input_tool.enableCustomFields, + enableModelsBlender=bpy.context.scene.input_tool.modelBool, + modelFileFormat=bpy.context.scene.input_tool.modelEnum, - cardanoMetaDataBool = bpy.context.scene.input_tool.cardanoMetaDataBool, - solanaMetaDataBool = bpy.context.scene.input_tool.solanaMetaDataBool, - erc721MetaData = 
bpy.context.scene.input_tool.erc721MetaData, + enableCustomFields=bpy.context.scene.input_tool.enableCustomFields, - cardano_description = bpy.context.scene.input_tool.cardano_description, - solana_description = bpy.context.scene.input_tool.solana_description, - erc721_description = bpy.context.scene.input_tool.erc721_description, + cardanoMetaDataBool=bpy.context.scene.input_tool.cardanoMetaDataBool, + solanaMetaDataBool=bpy.context.scene.input_tool.solanaMetaDataBool, + erc721MetaData=bpy.context.scene.input_tool.erc721MetaData, - enableMaterials = bpy.context.scene.input_tool.enableMaterials, - materialsFile = bpy.path.abspath(bpy.context.scene.input_tool.materialsFile) + cardano_description=bpy.context.scene.input_tool.cardano_description, + solana_description=bpy.context.scene.input_tool.solana_description, + erc721_description=bpy.context.scene.input_tool.erc721_description, + + enableMaterials=bpy.context.scene.input_tool.enableMaterials, + materialsFile=bpy.path.abspath(bpy.context.scene.input_tool.materialsFile), + + enableAutoShutdown=bpy.context.scene.input_tool.enableAutoShutdown, + + specify_timeBool=bpy.context.scene.input_tool.specify_timeBool, + hours=bpy.context.scene.input_tool.hours, + minutes=bpy.context.scene.input_tool.minutes, + + emailNotificationBool=bpy.context.scene.input_tool.emailNotificationBool, + sender_from=bpy.context.scene.input_tool.sender_from, + email_password=bpy.context.scene.input_tool.email_password, + receiver_to=bpy.context.scene.input_tool.receiver_to, ) return data -@persistent -def Refresh_UI(dummy1, dummy2): - """ - Refreshes the UI upon user interacting with Blender (using depsgraph_update_post handler). Might be a better handler - to use. - """ - global combinations - global recommended_limit - - combinations = (get_combinations.get_combinations()) - recommended_limit = int(round(combinations / 2)) - - # Add panel classes that require refresh to this refresh_panels tuple: - refresh_panel_classes = ( - BMNFTS_PT_CreateData, - ) - - def redraw_panel(refresh_panel_classes): - for i in refresh_panel_classes: - try: - bpy.utils.unregister_class(i) - except: - pass - bpy.utils.register_class(i) - - redraw_panel(refresh_panel_classes) - - -bpy.app.handlers.depsgraph_update_post.append(Refresh_UI) - # ======== Helper functions ======== # def make_directories(save_path): @@ -222,7 +255,7 @@ def runAsHeadless(): """ For use when running from the command line. 
""" - + # force CUDA device usage with cycles renderer cprefs = bpy.context.preferences.addons['cycles'].preferences cprefs.compute_device_type = 'CUDA' @@ -230,35 +263,35 @@ def runAsHeadless(): print(cprefs.devices.keys()) for key in cprefs.devices.keys(): - cprefs.devices[key].use = True + cprefs.devices[key].use = True print('Using {} devices for rendering!'.format(cprefs.get_num_gpu_devices())) def dumpSettings(settings): output = ( - f"nftName={ settings.nftName }\n" - f"collectionSize={ str(settings.collectionSize) }\n" - f"nftsPerBatch={ str(settings.nftsPerBatch) }\n" - f"save_path={ settings.save_path }\n" - f"enableRarity={ (settings.enableRarity) }\n" - f"enableLogic={ str(settings.enableLogic) }\n" - f"imageBool={ str(settings.imageBool) }\n" - f"imageEnum={ settings.imageEnum }\n" - f"animationBool={ str(settings.animationBool) }\n" - f"animationEnum={ settings.animationEnum }\n" - f"modelBool={ str(settings.modelBool) }\n" - f"modelEnum={ settings.modelEnum }\n" - f"batchToGenerate={ str(settings.batchToGenerate) }\n" - f"cardanoMetaDataBool={ str(settings.cardanoMetaDataBool) }\n" - f"cardano_description={ settings.cardano_description }\n" - f"erc721MetaData={ str(settings.erc721MetaData) }\n" - f"erc721_description={ settings.erc721_description }\n" - f"solanaMetaDataBool={ str(settings.solanaMetaDataBool) }\n" - f"solana_description={ settings.solana_description }\n" - f"enableCustomFields={ str(settings.enableCustomFields) }\n" - f"customfieldsFile={ settings.customfieldsFile }\n" - f"enableMaterials={ str(settings.customfieldsFile) }\n" - f"materialsFile={ settings.materialsFile }\n" + f"nftName={settings.nftName}\n" + f"collectionSize={str(settings.collectionSize)}\n" + f"nftsPerBatch={str(settings.nftsPerBatch)}\n" + f"save_path={settings.save_path}\n" + f"enableRarity={(settings.enableRarity)}\n" + f"enableLogic={str(settings.enableLogic)}\n" + f"imageBool={str(settings.imageBool)}\n" + f"imageEnum={settings.imageEnum}\n" + f"animationBool={str(settings.animationBool)}\n" + f"animationEnum={settings.animationEnum}\n" + f"modelBool={str(settings.modelBool)}\n" + f"modelEnum={settings.modelEnum}\n" + f"batchToGenerate={str(settings.batchToGenerate)}\n" + f"cardanoMetaDataBool={str(settings.cardanoMetaDataBool)}\n" + f"cardano_description={settings.cardano_description}\n" + f"erc721MetaData={str(settings.erc721MetaData)}\n" + f"erc721_description={settings.erc721_description}\n" + f"solanaMetaDataBool={str(settings.solanaMetaDataBool)}\n" + f"solana_description={settings.solana_description}\n" + f"enableCustomFields={str(settings.enableCustomFields)}\n" + f"customfieldsFile={settings.customfieldsFile}\n" + f"enableMaterials={str(settings.customfieldsFile)}\n" + f"materialsFile={settings.materialsFile}\n" ) print(output) @@ -275,31 +308,31 @@ def runAsHeadless(): # print(pairs) - settings.nftName = pairs[0][1] - settings.collectionSize = int(pairs[1][1]) - settings.nftsPerBatch = int(pairs[2][1]) - settings.save_path = pairs[3][1] - settings.enableRarity = pairs[4][1] == 'True' - settings.enableLogic = pairs[5][1] == 'True' - settings.enableLogicJson = pairs[6][1] == 'True' - settings.logicFile = pairs[7][1] - settings.imageBool = pairs[8][1] == 'True' - settings.imageEnum = pairs[9][1] - settings.animationBool = pairs[10][1] == 'True' - settings.animationEnum = pairs[11][1] - settings.modelBool = pairs[12][1] == 'True' - settings.modelEnum = pairs[13][1] - settings.batchToGenerate = int(pairs[14][1]) - settings.cardanoMetaDataBool = pairs[15][1] == 'True' - 
settings.cardano_description = pairs[16][1] - settings.erc721MetaData = pairs[17][1] == 'True' - settings.erc721_description = pairs[18][1] - settings.solanaMetaDataBool = pairs[19][1] == 'True' - settings.solanaDescription = pairs[20][1] - settings.enableCustomFields = pairs[21][1] == 'True' - settings.customfieldsFile = pairs[22][1] - settings.enableMaterials = pairs[23][1] == 'True' - settings.materialsFile = pairs[24][1] + settings.nftName = pairs[0][1] + settings.collectionSize = int(pairs[1][1]) + settings.nftsPerBatch = int(pairs[2][1]) + settings.save_path = pairs[3][1] + settings.enableRarity = pairs[4][1] == 'True' + settings.enableLogic = pairs[5][1] == 'True' + settings.enableLogicJson = pairs[6][1] == 'True' + settings.logicFile = pairs[7][1] + settings.imageBool = pairs[8][1] == 'True' + settings.imageEnum = pairs[9][1] + settings.animationBool = pairs[10][1] == 'True' + settings.animationEnum = pairs[11][1] + settings.modelBool = pairs[12][1] == 'True' + settings.modelEnum = pairs[13][1] + settings.batchToGenerate = int(pairs[14][1]) + settings.cardanoMetaDataBool = pairs[15][1] == 'True' + settings.cardano_description = pairs[16][1] + settings.erc721MetaData = pairs[17][1] == 'True' + settings.erc721_description = pairs[18][1] + settings.solanaMetaDataBool = pairs[19][1] == 'True' + settings.solanaDescription = pairs[20][1] + settings.enableCustomFields = pairs[21][1] == 'True' + settings.customfieldsFile = pairs[22][1] + settings.enableMaterials = pairs[23][1] == 'True' + settings.materialsFile = pairs[24][1] if args.save_path: settings.save_path = args.save_path @@ -425,10 +458,30 @@ class BMNFTS_PGT_Input_Properties(bpy.types.PropertyGroup): subtype="FILE_PATH" ) + # TODO: Add 'Other' panel inputs to Headless functionality. + # Other Panel: + enableAutoSave: bpy.props.BoolProperty(name="Auto Save Before Generation", + description="Automatically saves your Blender file when 'Generate NFTs & Create Metadata' button is clicked") + + # Auto Shutdown: + enableAutoShutdown: bpy.props.BoolProperty(name="Auto Shutdown", + description="Automatically shuts down your computer after a Batch is finished Generating") + + specify_timeBool: bpy.props.BoolProperty(name="Shutdown in a Given Amount of Time", + description="Wait a given amount of time after a Batch is generated before Automatic Shutdown") + hours: bpy.props.IntProperty(default=0, min=0) + minutes: bpy.props.IntProperty(default=0, min=0) + + # Send Batch Complete Email: + emailNotificationBool: bpy.props.BoolProperty(name="Email Notifications", + description="Receive Email Notifications from Blender once a batch is finished generating") + sender_from: bpy.props.StringProperty(name="From", default="from@example.com") + email_password: bpy.props.StringProperty(name="Password", subtype='PASSWORD') + receiver_to: bpy.props.StringProperty(name="To", default="to@example.com") # API Panel properties: - apiKey: bpy.props.StringProperty(name="API Key", subtype='PASSWORD') + apiKey: bpy.props.StringProperty(name="API Key", subtype='PASSWORD') # Test code for future faetures # ======== Main Operators ======== # @@ -448,10 +501,11 @@ class createData(bpy.types.Operator): if input.enableLogic: if input.enable_Logic_Json and not input.logicFile: - self.report({'ERROR'}, f"No Logic.json file path set. Please set the file path to your Logic.json file.") + self.report({'ERROR'}, + f"No Logic.json file path set. 
Please set the file path to your Logic.json file.") Intermediate.send_To_Record_JSON(input) - + self.report({'INFO'}, f"NFT Data created!") return {"FINISHED"} @@ -470,10 +524,9 @@ class exportNFTs(bpy.types.Operator): name="Reverse Order") def execute(self, context): - input = getBMNFTData() # Handling Custom Fields UIList input: - + Intermediate.render_and_save_NFTs(input) self.report({'INFO'}, f"All NFTs generated for batch {input.batchToGenerate}!") @@ -498,42 +551,42 @@ class resume_failed_batch(bpy.types.Operator): _fail_state, _failed_batch, _failed_dna, _failed_dna_index = Checks.check_FailedBatches(_batch_json_save_path) - input = BMNFTData ( - nftName = batchData["Generation Save"][-1]["Render_Settings"]["nftName"], - save_path = _save_path, - collectionSize = batchData["Generation Save"][-1]["Render_Settings"]["collectionSize"], + input = BMNFTData( + nftName=batchData["Generation Save"][-1]["Render_Settings"]["nftName"], + save_path=_save_path, + collectionSize=batchData["Generation Save"][-1]["Render_Settings"]["collectionSize"], - Blend_My_NFTs_Output = _Blend_My_NFTs_Output, - batch_json_save_path = _batch_json_save_path, - nftBatch_save_path = batchData["Generation Save"][-1]["Render_Settings"]["nftBatch_save_path"], + Blend_My_NFTs_Output=_Blend_My_NFTs_Output, + batch_json_save_path=_batch_json_save_path, + nftBatch_save_path=batchData["Generation Save"][-1]["Render_Settings"]["nftBatch_save_path"], - enableImages = batchData["Generation Save"][-1]["Render_Settings"]["enableImages"], - imageFileFormat = batchData["Generation Save"][-1]["Render_Settings"]["imageFileFormat"], + enableImages=batchData["Generation Save"][-1]["Render_Settings"]["enableImages"], + imageFileFormat=batchData["Generation Save"][-1]["Render_Settings"]["imageFileFormat"], - enableAnimations = batchData["Generation Save"][-1]["Render_Settings"]["enableAnimations"], - animationFileFormat = batchData["Generation Save"][-1]["Render_Settings"]["animationFileFormat"], + enableAnimations=batchData["Generation Save"][-1]["Render_Settings"]["enableAnimations"], + animationFileFormat=batchData["Generation Save"][-1]["Render_Settings"]["animationFileFormat"], - enableModelsBlender = batchData["Generation Save"][-1]["Render_Settings"]["enableModelsBlender"], - modelFileFormat = batchData["Generation Save"][-1]["Render_Settings"]["modelFileFormat"], + enableModelsBlender=batchData["Generation Save"][-1]["Render_Settings"]["enableModelsBlender"], + modelFileFormat=batchData["Generation Save"][-1]["Render_Settings"]["modelFileFormat"], - enableCustomFields = batchData["Generation Save"][-1]["Render_Settings"]["enableCustomFields"], - custom_Fields = batchData["Generation Save"][-1]["Render_Settings"]["custom_Fields"], + enableCustomFields=batchData["Generation Save"][-1]["Render_Settings"]["enableCustomFields"], + custom_Fields=batchData["Generation Save"][-1]["Render_Settings"]["custom_Fields"], - cardanoMetaDataBool = batchData["Generation Save"][-1]["Render_Settings"]["cardanoMetaDataBool"], - solanaMetaDataBool = batchData["Generation Save"][-1]["Render_Settings"]["solanaMetaDataBool"], - erc721MetaData = batchData["Generation Save"][-1]["Render_Settings"]["erc721MetaData"], + cardanoMetaDataBool=batchData["Generation Save"][-1]["Render_Settings"]["cardanoMetaDataBool"], + solanaMetaDataBool=batchData["Generation Save"][-1]["Render_Settings"]["solanaMetaDataBool"], + erc721MetaData=batchData["Generation Save"][-1]["Render_Settings"]["erc721MetaData"], - cardano_description = batchData["Generation 
Save"][-1]["Render_Settings"]["cardano_description"], - solana_description = batchData["Generation Save"][-1]["Render_Settings"]["solana_description"], - erc721_description = batchData["Generation Save"][-1]["Render_Settings"]["erc721_description"], + cardano_description=batchData["Generation Save"][-1]["Render_Settings"]["cardano_description"], + solana_description=batchData["Generation Save"][-1]["Render_Settings"]["solana_description"], + erc721_description=batchData["Generation Save"][-1]["Render_Settings"]["erc721_description"], - enableMaterials = batchData["Generation Save"][-1]["Render_Settings"]["enableMaterials"], - materialsFile = batchData["Generation Save"][-1]["Render_Settings"]["materialsFile"], + enableMaterials=batchData["Generation Save"][-1]["Render_Settings"]["enableMaterials"], + materialsFile=batchData["Generation Save"][-1]["Render_Settings"]["materialsFile"], - fail_state = _fail_state, - failed_batch = _failed_batch, - failed_dna = _failed_dna, - failed_dna_index = _failed_dna_index + fail_state=_fail_state, + failed_batch=_failed_batch, + failed_dna=_failed_dna, + failed_dna_index=_failed_dna_index ) Exporter.render_and_save_NFTs(input) @@ -582,51 +635,51 @@ class export_settings(bpy.types.Operator): "#when running Blend_My_NFTs in a headless environment.\n" "\n" "#The name of your nft project\n" - f"nftName={ settings.nftName }\n" + f"nftName={settings.nftName}\n" "\n" "#NFT Collection Size\n" - f"collectionSize={ settings.collectionSize }\n" + f"collectionSize={settings.collectionSize}\n" "\n" "#The number of NFTs to generate per batch\n" - f"nftsPerBatch={ str(settings.nftsPerBatch) }\n" + f"nftsPerBatch={str(settings.nftsPerBatch)}\n" "\n" "#Save path for your NFT files\n" - f"save_path={ settings.save_path }\n" + f"save_path={settings.save_path}\n" "\n" "#Enable Rarity\n" - f"enableRarity={ (settings.enableRarity) }\n" + f"enableRarity={(settings.enableRarity)}\n" "\n" "#Enable Logic\n" - f"enableLogic={ str(settings.enableLogic) }\n" - f"enableLogicJson={ str(settings.enable_Logic_Json) }\n" - f"logicFilePath={ settings.logicFile }\n" + f"enableLogic={str(settings.enableLogic)}\n" + f"enableLogicJson={str(settings.enable_Logic_Json)}\n" + f"logicFilePath={settings.logicFile}\n" "\n" "#NFT Media output type(s):\n" - f"imageBool={ str(settings.imageBool) }\n" - f"imageEnum={ settings.imageEnum }\n" - f"animationBool={ str(settings.animationBool) }\n" - f"animationEnum={ settings.animationEnum }\n" - f"modelBool={ str(settings.modelBool) }\n" - f"modelEnum={ settings.modelEnum }\n" + f"imageBool={str(settings.imageBool)}\n" + f"imageEnum={settings.imageEnum}\n" + f"animationBool={str(settings.animationBool)}\n" + f"animationEnum={settings.animationEnum}\n" + f"modelBool={str(settings.modelBool)}\n" + f"modelEnum={settings.modelEnum}\n" "\n" "#Batch to generate\n" - f"batchToGenerate={ str(settings.batchToGenerate) }\n" + f"batchToGenerate={str(settings.batchToGenerate)}\n" "\n" "#Metadata Format\n" - f"cardanoMetaDataBool={ str(settings.cardanoMetaDataBool) }\n" - f"cardano_description={ settings.cardano_description }\n" - f"erc721MetaData={ str(settings.erc721MetaData) }\n" - f"erc721_description={ settings.erc721_description }\n" - f"solanaMetaDataBool={ str(settings.solanaMetaDataBool) }\n" - f"solana_description={ settings.solana_description }\n" + f"cardanoMetaDataBool={str(settings.cardanoMetaDataBool)}\n" + f"cardano_description={settings.cardano_description}\n" + f"erc721MetaData={str(settings.erc721MetaData)}\n" + 
f"erc721_description={settings.erc721_description}\n" + f"solanaMetaDataBool={str(settings.solanaMetaDataBool)}\n" + f"solana_description={settings.solana_description}\n" "\n" "#Enable Custom Fields\n" - f"enableCustomFields={ str(settings.enableCustomFields) }\n" - f"customfieldsFile={ settings.customfieldsFile }\n" + f"enableCustomFields={str(settings.enableCustomFields)}\n" + f"customfieldsFile={settings.customfieldsFile}\n" "\n" "#Enable Materials\n" - f"enableMaterials={ str(settings.enableMaterials) }\n" - f"materialsFile={ settings.materialsFile }\n" + f"enableMaterials={str(settings.enableMaterials)}\n" + f"materialsFile={settings.materialsFile}\n" ) print(output, file=config) @@ -788,7 +841,8 @@ class BMNFTS_PT_GenerateNFTs(bpy.types.Panel): rows = 2 row = layout.row() - row.template_list("CUSTOM_UL_custom_metadata_fields_items", "", scn, "custom_metadata_fields", scn, "custom_metadata_fields_index", rows=rows) + row.template_list("CUSTOM_UL_custom_metadata_fields_items", "", scn, "custom_metadata_fields", scn, + "custom_metadata_fields_index", rows=rows) col = row.column(align=True) col.operator("custom_metadata_fields_uilist.list_action", icon='ZOOM_IN', text="").action = 'ADD' @@ -843,7 +897,6 @@ class BMNFTS_PT_Refactor(bpy.types.Panel): layout.label(text="Ensure all batches have been created before refactoring.") layout.label(text="Refactoring combines all batches into one easy to manage folder.") - row = layout.row() self.layout.operator("refactor.batches", icon='FOLDER_REDIRECT', text="Refactor Batches") @@ -861,10 +914,59 @@ class BMNFTS_PT_Other(bpy.types.Panel): input_tool_scene = scene.input_tool """ + Other: + A place to store miscellaneous settings, features, and external links that the user may find useful but doesn't + want to get in the way of their work flow. + Export Settings: This panel gives the user the option to export all settings from the Blend_My_NFTs addon into a config file. Settings will be read from the config file when running heedlessly. 
""" + + row = layout.row() + row.prop(input_tool_scene, "enableAutoSave") + + # Auto Shutdown: + row = layout.row() + row.prop(input_tool_scene, "enableAutoShutdown") + row.label(text="*Must Run Blender as Admin") + + if bpy.context.scene.input_tool.enableAutoShutdown: + row = layout.row() + row.prop(input_tool_scene, "specify_timeBool") + + time_row1 = layout.row() + time_row1.label(text=f"Hours") + time_row1.prop(input_tool_scene, "hours", text="") + + time_row2 = layout.row() + time_row2.label(text=f"Minutes") + time_row2.prop(input_tool_scene, "minutes", text="") + + if not bpy.context.scene.input_tool.specify_timeBool: + time_row1.enabled = False + time_row2.enabled = False + else: + time_row1.enabled = True + time_row2.enabled = True + layout.separator() + + row = layout.row() + row.prop(input_tool_scene, "emailNotificationBool") + row.label(text="*Windows 10+ only") + + if bpy.context.scene.input_tool.emailNotificationBool: + row = layout.row() + row.prop(input_tool_scene, "sender_from") + row = layout.row() + row.prop(input_tool_scene, "email_password") + + layout.separator() + row = layout.row() + row.prop(input_tool_scene, "receiver_to") + + layout.separator() + layout.label(text=f"Running Blend_My_NFTs Headless:") save_path = bpy.path.abspath(bpy.context.scene.input_tool.save_path) @@ -898,22 +1000,22 @@ class BMNFTS_PT_Other(bpy.types.Panel): # ======== Blender add-on register/unregister handling ======== # classes = ( - # Property Group Classes: - BMNFTS_PGT_Input_Properties, + # Property Group Classes: + BMNFTS_PGT_Input_Properties, - # Operator Classes: - createData, - exportNFTs, - resume_failed_batch, - refactor_Batches, - export_settings, + # Operator Classes: + createData, + exportNFTs, + resume_failed_batch, + refactor_Batches, + export_settings, - # Panel Classes: - BMNFTS_PT_CreateData, - BMNFTS_PT_GenerateNFTs, - BMNFTS_PT_Refactor, - BMNFTS_PT_Other, -) + Custom_Metadata_UIList.classes_Custom_Metadata_UIList + Logic_UIList.classes_Logic_UIList + # Panel Classes: + BMNFTS_PT_CreateData, + BMNFTS_PT_GenerateNFTs, + BMNFTS_PT_Refactor, + BMNFTS_PT_Other, + ) + Custom_Metadata_UIList.classes_Custom_Metadata_UIList + Logic_UIList.classes_Logic_UIList def register(): @@ -922,12 +1024,14 @@ def register(): bpy.types.Scene.input_tool = bpy.props.PointerProperty(type=BMNFTS_PGT_Input_Properties) - bpy.types.Scene.custom_metadata_fields = CollectionProperty(type=Custom_Metadata_UIList.CUSTOM_custom_metadata_fields_objectCollection) + bpy.types.Scene.custom_metadata_fields = CollectionProperty( + type=Custom_Metadata_UIList.CUSTOM_custom_metadata_fields_objectCollection) bpy.types.Scene.custom_metadata_fields_index = IntProperty() bpy.types.Scene.logic_fields = CollectionProperty(type=Logic_UIList.CUSTOM_logic_objectCollection) bpy.types.Scene.logic_fields_index = IntProperty() + def unregister(): for cls in reversed(classes): bpy.utils.unregister_class(cls) diff --git a/main/Constants.py b/main/Constants.py index a3dd05a..3cc94c3 100644 --- a/main/Constants.py +++ b/main/Constants.py @@ -2,7 +2,8 @@ # This file is for storing or updating constant values that may need to be changes depending on system requirements and # different usecases. import os - +import json +import platform removeList = [".gitignore", ".DS_Store", "desktop.ini", ".ini"] @@ -34,3 +35,17 @@ class bcolors: ERROR = '\033[91m' # RED RESET = '\033[0m' # RESET COLOR +def save_result(result): + """ + Saves json result to json file at the specified path. 
+ """ + file_name = "log.json" + if platform.system() == "Linux" or platform.system() == "Darwin": + path = os.path.join(os.path.join(os.path.expanduser('~')), 'Desktop', file_name) + + if platform.system() == "Windows": + path = os.path.join(os.environ["HOMEPATH"], "Desktop", file_name) + + data = json.dumps(result, indent=1, ensure_ascii=True) + with open(path, 'w') as outfile: + outfile.write(data + '\n') diff --git a/main/DNA_Generator.py b/main/DNA_Generator.py index a455743..d045e2f 100644 --- a/main/DNA_Generator.py +++ b/main/DNA_Generator.py @@ -15,375 +15,354 @@ from .Constants import bcolors, removeList, remove_file_by_extension def get_hierarchy(): - """ - Returns the hierarchy of a given Blender scene. - """ + """ + Returns the hierarchy of a given Blender scene. + """ - coll = bpy.context.scene.collection + coll = bpy.context.scene.collection - scriptIgnoreCollection = bpy.data.collections["Script_Ignore"] + scriptIgnoreCollection = bpy.data.collections["Script_Ignore"] - listAllCollInScene = [] - listAllCollections = [] + listAllCollInScene = [] + listAllCollections = [] - def traverse_tree(t): - yield t - for child in t.children: - yield from traverse_tree(child) + def traverse_tree(t): + yield t + for child in t.children: + yield from traverse_tree(child) - for c in traverse_tree(coll): - listAllCollInScene.append(c) + for c in traverse_tree(coll): + listAllCollInScene.append(c) - for i in listAllCollInScene: - listAllCollections.append(i.name) + for i in listAllCollInScene: + listAllCollections.append(i.name) - listAllCollections.remove(scriptIgnoreCollection.name) + listAllCollections.remove(scriptIgnoreCollection.name) - if "Scene Collection" in listAllCollections: - listAllCollections.remove("Scene Collection") + if "Scene Collection" in listAllCollections: + listAllCollections.remove("Scene Collection") - if "Master Collection" in listAllCollections: - listAllCollections.remove("Master Collection") + if "Master Collection" in listAllCollections: + listAllCollections.remove("Master Collection") - def allScriptIgnore(scriptIgnoreCollection): - # Removes all collections, sub collections in Script_Ignore collection from listAllCollections. + def allScriptIgnore(scriptIgnoreCollection): + # Removes all collections, sub collections in Script_Ignore collection from listAllCollections. - for coll in list(scriptIgnoreCollection.children): - listAllCollections.remove(coll.name) - listColl = list(coll.children) - if len(listColl) > 0: - allScriptIgnore(coll) + for coll in list(scriptIgnoreCollection.children): + listAllCollections.remove(coll.name) + listColl = list(coll.children) + if len(listColl) > 0: + allScriptIgnore(coll) - allScriptIgnore(scriptIgnoreCollection) - listAllCollections.sort() + allScriptIgnore(scriptIgnoreCollection) + listAllCollections.sort() - exclude = ["_"] # Excluding characters that identify a Variant - attributeCollections = copy.deepcopy(listAllCollections) + exclude = ["_"] # Excluding characters that identify a Variant + attributeCollections = copy.deepcopy(listAllCollections) - def filter_num(): - """ - This function removes items from 'attributeCollections' if they include values from the 'exclude' variable. - It removes child collections from the parent collections in from the "listAllCollections" list. - """ - for x in attributeCollections: - if any(a in x for a in exclude): - attributeCollections.remove(x) + def filter_num(): + """ + This function removes items from 'attributeCollections' if they include values from the 'exclude' variable. 
+ It removes child collections from the parent collections in from the "listAllCollections" list. + """ + for x in attributeCollections: + if any(a in x for a in exclude): + attributeCollections.remove(x) - for i in range(len(listAllCollections)): - filter_num() + for i in range(len(listAllCollections)): + filter_num() - attributeVariants = [x for x in listAllCollections if x not in attributeCollections] - attributeCollections1 = copy.deepcopy(attributeCollections) + attributeVariants = [x for x in listAllCollections if x not in attributeCollections] + attributeCollections1 = copy.deepcopy(attributeCollections) - def attributeData(attributeVariants): - """ + def attributeData(attributeVariants): + """ Creates a dictionary of each attribute """ - allAttDataList = {} - for i in attributeVariants: - # Check if name follows naming conventions: - if i.count("_") > 2: - raise Exception( - f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n" - f"There is a naming issue with the following Attribute/Variant: '{i}'\n" - f"Review the naming convention of Attribute and Variant collections here:\n{bcolors.RESET}" - f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n" - ) + allAttDataList = {} + for i in attributeVariants: + # Check if name follows naming conventions: + if int(i.count("_")) > 2 and int(i.split("_")[1]) > 0: + raise Exception( + f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n" + f"There is a naming issue with the following Attribute/Variant: '{i}'\n" + f"Review the naming convention of Attribute and Variant collections here:\n{bcolors.RESET}" + f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n" + ) - def getName(i): - """ - Returns the name of "i" attribute variant - """ + try: + number = i.split("_")[1] + name = i.split("_")[0] + rarity = i.split("_")[2] + except IndexError: + raise Exception( + f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n" + f"There is a naming issue with the following Attribute/Variant: '{i}'\n" + f"Review the naming convention of Attribute and Variant collections here:\n{bcolors.RESET}" + f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n" + ) - name = i.split("_")[0] + allAttDataList[i] = {"name": name, "number": number, "rarity": rarity} + return allAttDataList - return name + variantMetaData = attributeData(attributeVariants) - def getOrder_rarity(i): - """ - Returns the "order" and "rarity" (if enabled) of i attribute variant in a list - """ - x = re.sub(r'[a-zA-Z]', "", i) - a = x.split("_") - del a[0] - return list(a) + hierarchy = {} + for i in attributeCollections1: + colParLong = list(bpy.data.collections[str(i)].children) + colParShort = {} + for x in colParLong: + colParShort[x.name] = None + hierarchy[i] = colParShort - name = getName(i) - orderRarity = getOrder_rarity(i) + for a in hierarchy: + for b in hierarchy[a]: + for x in variantMetaData: + if str(x) == str(b): + (hierarchy[a])[b] = variantMetaData[x] - try: - number = orderRarity[0] - except: - raise Exception( - f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n" - f"There is a naming issue with the following Attribute/Variant: '{i}'\n" - f"Review the naming convention of Attribute and Variant collections here:\n{bcolors.RESET}" - f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n" - ) + return hierarchy - try: - rarity = orderRarity[1] - except: - raise Exception( - f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n" - f"There is a naming issue with the following Attribute/Variant: 
'{i}'\n" - f"Review the naming convention of Attribute and Variant collections here:\n{bcolors.RESET}" - f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n" - ) - - eachObject = {"name": name, "number": number, "rarity": rarity} - allAttDataList[i] = eachObject - return allAttDataList - - variantMetaData = attributeData(attributeVariants) - - hierarchy = {} - for i in attributeCollections1: - colParLong = list(bpy.data.collections[str(i)].children) - colParShort = {} - for x in colParLong: - colParShort[x.name] = None - hierarchy[i] = colParShort - - for a in hierarchy: - for b in hierarchy[a]: - for x in variantMetaData: - if str(x) == str(b): - (hierarchy[a])[b] = variantMetaData[x] - - return hierarchy def generateNFT_DNA(collectionSize, enableRarity, enableLogic, logicFile, enableMaterials, materialsFile): - """ + """ Returns batchDataDictionary containing the number of NFT combinations, hierarchy, and the DNAList. """ - hierarchy = get_hierarchy() + hierarchy = get_hierarchy() - # DNA random, Rarity and Logic methods: - DataDictionary = {} + # DNA random, Rarity and Logic methods: + DataDictionary = {} - def createDNArandom(): - """Creates a single DNA randomly without Rarity or Logic.""" - dnaStr = "" - dnaStrList = [] - listOptionVariant = [] + def createDNArandom(hierarchy): + """Creates a single DNA randomly without Rarity or Logic.""" + dnaStr = "" + dnaStrList = [] + listOptionVariant = [] - for i in hierarchy: - numChild = len(hierarchy[i]) - possibleNums = list(range(1, numChild + 1)) - listOptionVariant.append(possibleNums) + for i in hierarchy: + numChild = len(hierarchy[i]) + possibleNums = list(range(1, numChild + 1)) + listOptionVariant.append(possibleNums) - for i in listOptionVariant: - randomVariantNum = random.choices(i, k=1) - str1 = ''.join(str(e) for e in randomVariantNum) - dnaStrList.append(str1) + for i in listOptionVariant: + randomVariantNum = random.choices(i, k=1) + str1 = ''.join(str(e) for e in randomVariantNum) + dnaStrList.append(str1) - for i in dnaStrList: - num = "-" + str(i) - dnaStr += num + for i in dnaStrList: + num = "-" + str(i) + dnaStr += num - dna = ''.join(dnaStr.split('-', 1)) + dna = ''.join(dnaStr.split('-', 1)) - return str(dna) + return str(dna) - def singleCompleteDNA(): - """This function applies Rarity and Logic to a single DNA created by createDNASingle() if Rarity or Logic specified""" - singleDNA = "" - # Comments for debugging random, rarity, logic, and materials. - if not enableRarity: - singleDNA = createDNArandom() - # print("============") - if enableRarity: - singleDNA = Rarity.createDNArarity(hierarchy) - # print(f"Rarity DNA: {singleDNA}") + def singleCompleteDNA(): + """ + This function applies Rarity and Logic to a single DNA created by createDNASingle() if Rarity or Logic specified + """ - if enableLogic: - singleDNA = Logic.logicafyDNAsingle(hierarchy, singleDNA, logicFile) - # print(f"Original DNA: {singleDNA}") - # print("============\n") + singleDNA = "" + # Comments for debugging random, rarity, logic, and materials. 
+ if not enableRarity: + singleDNA = createDNArandom(hierarchy) + # print("============") + # print(f"Original DNA: {singleDNA}") + if enableRarity: + singleDNA = Rarity.createDNArarity(hierarchy) + # print(f"Rarity DNA: {singleDNA}") - if enableMaterials: - singleDNA = Material_Generator.apply_materials(hierarchy, singleDNA, materialsFile) - # print(f"Materials DNA: {singleDNA}") - # print("============\n") + if enableLogic: + singleDNA = Logic.logicafyDNAsingle(hierarchy, singleDNA, logicFile, enableRarity, enableMaterials) + # print(f"Logic DNA: {singleDNA}") - return singleDNA + if enableMaterials: + singleDNA = Material_Generator.apply_materials(hierarchy, singleDNA, materialsFile, enableRarity) + # print(f"Materials DNA: {singleDNA}") - def create_DNAList(): - """Creates DNAList. Loops through createDNARandom() and applies Rarity, and Logic while checking if all DNA are unique""" - DNASetReturn = set() + # print("============\n") - for i in range(collectionSize): - dnaPushToList = partial(singleCompleteDNA) + return singleDNA - DNASetReturn |= {''.join([dnaPushToList()]) for _ in range(collectionSize - len(DNASetReturn))} + def create_DNAList(): + """Creates DNAList. Loops through createDNARandom() and applies Rarity, and Logic while checking if all DNA are unique""" + DNASetReturn = set() - DNAListUnformatted = list(DNASetReturn) + for i in range(collectionSize): + dnaPushToList = partial(singleCompleteDNA) - DNAListFormatted = [] - DNA_Counter = 1 - for i in DNAListUnformatted: - DNAListFormatted.append({ - i: { - "Complete": False, - "Order_Num": DNA_Counter - } - }) + DNASetReturn |= {''.join([dnaPushToList()]) for _ in range(collectionSize - len(DNASetReturn))} - DNA_Counter += 1 + DNAListUnformatted = list(DNASetReturn) - return DNAListFormatted + DNAListFormatted = [] + DNA_Counter = 1 + for i in DNAListUnformatted: + DNAListFormatted.append({ + i: { + "Complete": False, + "Order_Num": DNA_Counter + } + }) - DNAList = create_DNAList() + DNA_Counter += 1 - # Messages: + return DNAListFormatted - Checks.raise_Warning_collectionSize(DNAList, collectionSize) + DNAList = create_DNAList() - # Data stored in batchDataDictionary: - DataDictionary["numNFTsGenerated"] = len(DNAList) - DataDictionary["hierarchy"] = hierarchy - DataDictionary["DNAList"] = DNAList + # Messages: + + Checks.raise_Warning_collectionSize(DNAList, collectionSize) + + # Data stored in batchDataDictionary: + DataDictionary["numNFTsGenerated"] = len(DNAList) + DataDictionary["hierarchy"] = hierarchy + DataDictionary["DNAList"] = DNAList + + return DataDictionary - return DataDictionary def makeBatches(collectionSize, nftsPerBatch, save_path, batch_json_save_path): - """ + """ Sorts through all the batches and outputs a given number of batches depending on collectionSize and nftsPerBatch. 
These files are then saved as Batch#.json files to batch_json_save_path """ - # Clears the Batch Data folder of Batches: - batchList = os.listdir(batch_json_save_path) - if batchList: - for i in batchList: - batch = os.path.join(batch_json_save_path, i) - if os.path.exists(batch): - os.remove( - os.path.join(batch_json_save_path, i) - ) + # Clears the Batch Data folder of Batches: + batchList = os.listdir(batch_json_save_path) + if batchList: + for i in batchList: + batch = os.path.join(batch_json_save_path, i) + if os.path.exists(batch): + os.remove( + os.path.join(batch_json_save_path, i) + ) - Blend_My_NFTs_Output = os.path.join(save_path, "Blend_My_NFTs Output", "NFT_Data") - NFTRecord_save_path = os.path.join(Blend_My_NFTs_Output, "NFTRecord.json") - DataDictionary = json.load(open(NFTRecord_save_path)) + Blend_My_NFTs_Output = os.path.join(save_path, "Blend_My_NFTs Output", "NFT_Data") + NFTRecord_save_path = os.path.join(Blend_My_NFTs_Output, "NFTRecord.json") + DataDictionary = json.load(open(NFTRecord_save_path)) - numNFTsGenerated = DataDictionary["numNFTsGenerated"] - hierarchy = DataDictionary["hierarchy"] - DNAList = DataDictionary["DNAList"] + numNFTsGenerated = DataDictionary["numNFTsGenerated"] + hierarchy = DataDictionary["hierarchy"] + DNAList = DataDictionary["DNAList"] - numBatches = collectionSize // nftsPerBatch - remainder_dna = collectionSize % nftsPerBatch - if remainder_dna > 0: - numBatches += 1 + numBatches = collectionSize // nftsPerBatch + remainder_dna = collectionSize % nftsPerBatch + if remainder_dna > 0: + numBatches += 1 - print(f"To generate batches of {nftsPerBatch} DNA sequences per batch, with a total of {numNFTsGenerated}" - f" possible NFT DNA sequences, the number of batches generated will be {numBatches}") + print(f"To generate batches of {nftsPerBatch} DNA sequences per batch, with a total of {numNFTsGenerated}" + f" possible NFT DNA sequences, the number of batches generated will be {numBatches}") - batches_dna_list = [] + batches_dna_list = [] - for i in range(numBatches): - BatchDNAList = [] - if i != range(numBatches)[-1]: - BatchDNAList = list(DNAList[0:nftsPerBatch]) - batches_dna_list.append(BatchDNAList) + for i in range(numBatches): + BatchDNAList = [] + if i != range(numBatches)[-1]: + BatchDNAList = list(DNAList[0:nftsPerBatch]) + batches_dna_list.append(BatchDNAList) - DNAList = [x for x in DNAList if x not in BatchDNAList] - else: - BatchDNAList = DNAList + DNAList = [x for x in DNAList if x not in BatchDNAList] + else: + BatchDNAList = DNAList - batchDictionary = { - "NFTs_in_Batch": int(len(BatchDNAList)), - "hierarchy": hierarchy, - "BatchDNAList": BatchDNAList - } + batchDictionary = { + "NFTs_in_Batch": int(len(BatchDNAList)), + "hierarchy": hierarchy, + "BatchDNAList": BatchDNAList + } - batchDictionary = json.dumps(batchDictionary, indent=1, ensure_ascii=True) + batchDictionary = json.dumps(batchDictionary, indent=1, ensure_ascii=True) + + with open(os.path.join(batch_json_save_path, f"Batch{i + 1}.json"), "w") as outfile: + outfile.write(batchDictionary) - with open(os.path.join(batch_json_save_path, f"Batch{i + 1}.json"), "w") as outfile: - outfile.write(batchDictionary) def send_To_Record_JSON(collectionSize, nftsPerBatch, save_path, enableRarity, enableLogic, logicFile, enableMaterials, materialsFile, Blend_My_NFTs_Output, batch_json_save_path): - """ + """ Creates NFTRecord.json file and sends "batchDataDictionary" to it. NFTRecord.json is a permanent record of all DNA you've generated with all attribute variants. 
If you add new variants or attributes to your .blend file, other scripts need to reference this .json file to generate new DNA and make note of the new attributes and variants to prevent repeate DNA. """ - # Checking Scene is compatible with BMNFTs: - Checks.check_Scene() + # Checking Scene is compatible with BMNFTs: + Checks.check_Scene() - # Messages: - print( - f"\n========================================\n" - f"Creating NFT Data. Generating {collectionSize} NFT DNA.\n" - ) + # Messages: + print( + f"\n========================================\n" + f"Creating NFT Data. Generating {collectionSize} NFT DNA.\n" + ) - if not enableRarity and not enableLogic: - print(f"{bcolors.OK}NFT DNA will be determined randomly, no special properties or parameters are applied.\n{bcolors.RESET}") + if not enableRarity and not enableLogic: + print( + f"{bcolors.OK}NFT DNA will be determined randomly, no special properties or parameters are applied.\n{bcolors.RESET}") - if enableRarity: - print(f"{bcolors.OK}Rarity is ON. Weights listed in .blend scene will be taken into account.\n{bcolors.RESET}") + if enableRarity: + print(f"{bcolors.OK}Rarity is ON. Weights listed in .blend scene will be taken into account.\n{bcolors.RESET}") - if enableLogic: - print(f"{bcolors.OK}Logic is ON. Rules listed in {logicFile} will be taken into account.\n{bcolors.RESET}") + if enableLogic: + print(f"{bcolors.OK}Logic is ON. {len(list(logicFile.keys()))} rules detected and applied.\n{bcolors.RESET}") - time_start = time.time() + time_start = time.time() - def create_nft_data(): - try: - DataDictionary = generateNFT_DNA(collectionSize, enableRarity, enableLogic, logicFile, enableMaterials, - materialsFile) - NFTRecord_save_path = os.path.join(Blend_My_NFTs_Output, "NFTRecord.json") + def create_nft_data(): + try: + DataDictionary = generateNFT_DNA(collectionSize, enableRarity, enableLogic, logicFile, enableMaterials, + materialsFile) + NFTRecord_save_path = os.path.join(Blend_My_NFTs_Output, "NFTRecord.json") - # Checks: + # Checks: - Checks.raise_Warning_maxNFTs(nftsPerBatch, collectionSize) - Checks.check_Duplicates(DataDictionary["DNAList"]) - Checks.raise_Error_ZeroCombinations() + Checks.raise_Warning_maxNFTs(nftsPerBatch, collectionSize) + Checks.check_Duplicates(DataDictionary["DNAList"]) + Checks.raise_Error_ZeroCombinations() - if enableRarity: - Checks.check_Rarity(DataDictionary["hierarchy"], DataDictionary["DNAList"], os.path.join(save_path, "Blend_My_NFTs Output/NFT_Data")) + if enableRarity: + Checks.check_Rarity(DataDictionary["hierarchy"], DataDictionary["DNAList"], + os.path.join(save_path, "Blend_My_NFTs Output/NFT_Data")) - except FileNotFoundError: - raise FileNotFoundError( - f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n" - f"Data not saved to NFTRecord.json. Please review your Blender scene and ensure it follows " - f"the naming conventions and scene structure. For more information, " - f"see:\n{bcolors.RESET}" - f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n" - ) - finally: - loading.stop() + except FileNotFoundError: + raise FileNotFoundError( + f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n" + f"Data not saved to NFTRecord.json. Please review your Blender scene and ensure it follows " + f"the naming conventions and scene structure. 
For more information, " + f"see:\n{bcolors.RESET}" + f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n" + ) + finally: + loading.stop() - try: - ledger = json.dumps(DataDictionary, indent=1, ensure_ascii=True) - with open(NFTRecord_save_path, 'w') as outfile: - outfile.write(ledger + '\n') + try: + ledger = json.dumps(DataDictionary, indent=1, ensure_ascii=True) + with open(NFTRecord_save_path, 'w') as outfile: + outfile.write(ledger + '\n') - print( - f"\n{bcolors.OK}Blend_My_NFTs Success:\n" - f"{len(DataDictionary['DNAList'])} NFT DNA saved to {NFTRecord_save_path}. NFT DNA Successfully created.\n{bcolors.RESET}") + print( + f"\n{bcolors.OK}Blend_My_NFTs Success:\n" + f"{len(DataDictionary['DNAList'])} NFT DNA saved to {NFTRecord_save_path}. NFT DNA Successfully created.\n{bcolors.RESET}") - except: - raise ( - f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n" - f"Data not saved to NFTRecord.json. Please review your Blender scene and ensure it follows " - f"the naming conventions and scene structure. For more information, " - f"see:\n{bcolors.RESET}" - f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n" - ) + except: + raise ( + f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n" + f"Data not saved to NFTRecord.json. Please review your Blender scene and ensure it follows " + f"the naming conventions and scene structure. For more information, " + f"see:\n{bcolors.RESET}" + f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n" + ) - # Loading Animation: - loading = Loader(f'Creating NFT DNA...', '').start() - create_nft_data() - makeBatches(collectionSize, nftsPerBatch, save_path, batch_json_save_path) - loading.stop() + # Loading Animation: + loading = Loader(f'Creating NFT DNA...', '').start() + create_nft_data() + makeBatches(collectionSize, nftsPerBatch, save_path, batch_json_save_path) + loading.stop() - time_end = time.time() + time_end = time.time() - print( - f"{bcolors.OK}Created and saved NFT DNA in {time_end - time_start}s.\n{bcolors.RESET}" - ) + print( + f"{bcolors.OK}Created and saved NFT DNA in {time_end - time_start}s.\n{bcolors.RESET}" + ) diff --git a/main/Exporter.py b/main/Exporter.py index 51f86fe..dea61bf 100644 --- a/main/Exporter.py +++ b/main/Exporter.py @@ -4,9 +4,12 @@ import bpy import os +import ssl import time import json +import smtplib import datetime +import platform from .loading_animation import Loader from .Constants import bcolors, removeList, remove_file_by_extension from .Metadata import createCardanoMetadata, createSolanaMetaData, createErc721MetaData @@ -41,37 +44,37 @@ def save_generation_state(input): "Generation Start Date and Time": [CURRENT_TIME, CURRENT_DATE, LOCAL_TIMEZONE], "Render_Settings": { - "nftName": input.nftName, - "save_path": input.save_path, - "batchToGenerate": input.batchToGenerate, - "collectionSize": input.collectionSize, + "nftName": input.nftName, + "save_path": input.save_path, + "batchToGenerate": input.batchToGenerate, + "collectionSize": input.collectionSize, "Blend_My_NFTs_Output": input.Blend_My_NFTs_Output, "batch_json_save_path": input.batch_json_save_path, - "nftBatch_save_path": input.nftBatch_save_path, + "nftBatch_save_path": input.nftBatch_save_path, - "enableImages": input.enableImages, - "imageFileFormat": input.imageFileFormat, + "enableImages": input.enableImages, + "imageFileFormat": input.imageFileFormat, - "enableAnimations": input.enableAnimations, - "animationFileFormat": input.animationFileFormat, + 
"enableAnimations": input.enableAnimations, + "animationFileFormat": input.animationFileFormat, - "enableModelsBlender": input.enableModelsBlender, - "modelFileFormat": input.modelFileFormat, + "enableModelsBlender": input.enableModelsBlender, + "modelFileFormat": input.modelFileFormat, - "enableCustomFields": input.enableCustomFields, - "custom_Fields": input.custom_Fields, + "enableCustomFields": input.enableCustomFields, + "custom_Fields": input.custom_Fields, - "cardanoMetaDataBool": input.cardanoMetaDataBool, - "solanaMetaDataBool": input.solanaMetaDataBool, - "erc721MetaData": input.erc721MetaData, + "cardanoMetaDataBool": input.cardanoMetaDataBool, + "solanaMetaDataBool": input.solanaMetaDataBool, + "erc721MetaData": input.erc721MetaData, - "cardano_description": input.cardano_description, - "solana_description": input.solana_description, - "erc721_description": input.erc721_description, + "cardano_description": input.cardano_description, + "solana_description": input.solana_description, + "erc721_description": input.erc721_description, - "enableMaterials": input.enableMaterials, - "materialsFile": input.materialsFile, + "enableMaterials": input.enableMaterials, + "materialsFile": input.materialsFile, }, }) @@ -112,18 +115,20 @@ def render_and_save_NFTs(input): Renders the NFT DNA in a Batch#.json, where # is renderBatch in config.py. Turns off the viewport camera and the render camera for all items in hierarchy. """ - print(f"\nFAILED BATCH = {input.failed_batch}\n") - print(f"\nBATCH TO GENERATE = {input.batchToGenerate}\n") time_start_1 = time.time() + # If failed Batch is detected and user is resuming its generation: if input.fail_state: + print(f"{bcolors.ERROR}\nResuming Failed Batch {input.failed_batch}\n{bcolors.RESET}") NFTs_in_Batch, hierarchy, BatchDNAList = getBatchData(input.failed_batch, input.batch_json_save_path) for a in range(input.failed_dna): del BatchDNAList[0] x = input.failed_dna + 1 + # If user is generating the normal way: else: + print(f"\nGenerating Batch {input.batchToGenerate}\n") NFTs_in_Batch, hierarchy, BatchDNAList = getBatchData(input.batchToGenerate, input.batch_json_save_path) save_generation_state(input) x = 1 @@ -177,12 +182,14 @@ def render_and_save_NFTs(input): if hierarchy[attribute][var]['number'] == variant: variant = var - if material != '0': + if material != '0': # If material is not empty for variant_m in materialsFile: if variant == variant_m: - for mat in materialsFile[variant_m]["Material List"]: - if mat.split('_')[1] == material: - material = mat + # Getting Materials name from Materials index in the Materials List + materials_list = list(materialsFile[variant_m]["Material List"].keys()) + + material = materials_list[int(material) - 1] # Subtract 1 because '0' means empty mat + break full_dna_dict[variant] = material @@ -250,20 +257,20 @@ def render_and_save_NFTs(input): time_start_2 = time.time() # Main paths for batch subfolders: - batchFolder = os.path.join(input.nftBatch_save_path, "Batch" + str(input.batchToGenerate)) + batchFolder = os.path.join(input.nftBatch_save_path, "Batch" + str(input.batchToGenerate)) - imageFolder = os.path.join(batchFolder, "Images") - animationFolder = os.path.join(batchFolder, "Animations") - modelFolder = os.path.join(batchFolder, "Models") - BMNFT_metaData_Folder = os.path.join(batchFolder, "BMNFT_metadata") + imageFolder = os.path.join(batchFolder, "Images") + animationFolder = os.path.join(batchFolder, "Animations") + modelFolder = os.path.join(batchFolder, "Models") + BMNFT_metaData_Folder = 
os.path.join(batchFolder, "BMNFT_metadata") - imagePath = os.path.join(imageFolder, name) - animationPath = os.path.join(animationFolder, name) - modelPath = os.path.join(modelFolder, name) + imagePath = os.path.join(imageFolder, name) + animationPath = os.path.join(animationFolder, name) + modelPath = os.path.join(modelFolder, name) - cardanoMetadataPath = os.path.join(batchFolder, "Cardano_metadata") - solanaMetadataPath = os.path.join(batchFolder, "Solana_metadata") - erc721MetadataPath = os.path.join(batchFolder, "Erc721_metadata") + cardanoMetadataPath = os.path.join(batchFolder, "Cardano_metadata") + solanaMetadataPath = os.path.join(batchFolder, "Solana_metadata") + erc721MetadataPath = os.path.join(batchFolder, "Erc721_metadata") # Generation/Rendering: if input.enableImages: @@ -357,20 +364,31 @@ def render_and_save_NFTs(input): for obj in bpy.data.collections['Script_Ignore'].all_objects: obj.select_set(True) + # Remove objects from 3D model export: + # remove_objects: list = [ + # ] + # + # for obj in bpy.data.objects: + # if obj.name in remove_objects: + # obj.select_set(False) + if input.modelFileFormat == 'GLB': bpy.ops.export_scene.gltf(filepath=f"{modelPath}.glb", check_existing=True, export_format='GLB', + export_keep_originals=True, use_selection=True) if input.modelFileFormat == 'GLTF_SEPARATE': bpy.ops.export_scene.gltf(filepath=f"{modelPath}", check_existing=True, export_format='GLTF_SEPARATE', + export_keep_originals=True, use_selection=True) if input.modelFileFormat == 'GLTF_EMBEDDED': bpy.ops.export_scene.gltf(filepath=f"{modelPath}.gltf", check_existing=True, export_format='GLTF_EMBEDDED', + export_keep_originals=True, use_selection=True) elif input.modelFileFormat == 'FBX': bpy.ops.export_scene.fbx(filepath=f"{modelPath}.fbx", @@ -392,7 +410,7 @@ def render_and_save_NFTs(input): bpy.ops.export_vox.some_data(filepath=f"{modelPath}.vox") # Loading Animation: - loading = Loader(f'Rendering Animation {x}/{NFTs_in_Batch}...', '').start() + loading = Loader(f'Generating 3D model {x}/{NFTs_in_Batch}...', '').start() generate_models() loading.stop() @@ -406,20 +424,23 @@ def render_and_save_NFTs(input): if input.cardanoMetaDataBool: if not os.path.exists(cardanoMetadataPath): os.makedirs(cardanoMetadataPath) - createCardanoMetadata(name, Order_Num, full_single_dna, dnaDictionary, metadataMaterialDict, input.custom_Fields, + createCardanoMetadata(name, Order_Num, full_single_dna, dnaDictionary, metadataMaterialDict, + input.custom_Fields, input.enableCustomFields, input.cardano_description, cardanoMetadataPath) if input.solanaMetaDataBool: if not os.path.exists(solanaMetadataPath): os.makedirs(solanaMetadataPath) - createSolanaMetaData(name, Order_Num, full_single_dna, dnaDictionary, metadataMaterialDict, input.custom_Fields, - input.enableCustomFields, input.cardano_description, solanaMetadataPath) + createSolanaMetaData(name, Order_Num, full_single_dna, dnaDictionary, metadataMaterialDict, + input.custom_Fields, + input.enableCustomFields, input.solana_description, solanaMetadataPath) if input.erc721MetaData: if not os.path.exists(erc721MetadataPath): os.makedirs(erc721MetadataPath) - createErc721MetaData(name, Order_Num, full_single_dna, dnaDictionary, metadataMaterialDict, input.custom_Fields, - input.enableCustomFields, input.cardano_description, erc721MetadataPath) + createErc721MetaData(name, Order_Num, full_single_dna, dnaDictionary, metadataMaterialDict, + input.custom_Fields, + input.enableCustomFields, input.erc721_description, erc721MetadataPath) if not 
os.path.exists(BMNFT_metaData_Folder): os.makedirs(BMNFT_metaData_Folder) @@ -457,3 +478,63 @@ def render_and_save_NFTs(input): batch_infoFolder = os.path.join(input.nftBatch_save_path, "Batch" + str(input.batchToGenerate), "batch_info.json") save_batch(batch_info, batch_infoFolder) + + # Send Email that Batch is complete: + if input.emailNotificationBool: + port = 465 # For SSL + smtp_server = "smtp.gmail.com" + sender_email = input.sender_from # Enter your address + receiver_email = input.receiver_to # Enter receiver address + password = input.email_password + + # Get batch info for message: + if input.fail_state: + batch = input.fail_state + batchData = getBatchData(input.failed_batch, input.batch_json_save_path) + + else: + batchData = getBatchData(input.batchToGenerate, input.batch_json_save_path) + + batch = input.batchToGenerate + + generation_time = str(datetime.timedelta(seconds=batch_complete_time)) + + message = f"""\ + Subject: Batch {batch} completed {x - 1} NFTs in {generation_time} (h:m:s) + + Generation Time: + {generation_time.split(':')[0]} Hours, {generation_time.split(':')[1]} Minutes, {generation_time.split(':')[2]} Seconds + Batch Data: + + {batchData} + + This message was sent from an instance of the Blend_My_NFTs Blender add-on. + """ + + context = ssl.create_default_context() + with smtplib.SMTP_SSL(smtp_server, port, context=context) as server: + server.login(sender_email, password) + server.sendmail(sender_email, receiver_email, message) + + # Automatic Shutdown: + # If user selects automatic shutdown but did not specify time after Batch completion + def shutdown(time): + plateform = platform.system() + + if plateform == "Windows": + os.system(f"shutdown /s /t {time}") + if plateform == "Darwin": + os.system(f"shutdown /s /t {time}") + + if input.enableAutoShutdown and not input.specify_timeBool: + shutdown(0) + + # If user selects automatic shutdown and specify time after Batch completion + if input.enableAutoShutdown and input.specify_timeBool: + hours = (int(input.hours) / 60) / 60 + minutes = int(input.minutes) / 60 + total_sleep_time = hours + minutes + + # time.sleep(total_sleep_time) + + shutdown(total_sleep_time) diff --git a/main/Intermediate.py b/main/Intermediate.py index 5606e60..a5479e1 100644 --- a/main/Intermediate.py +++ b/main/Intermediate.py @@ -3,8 +3,8 @@ import bpy from main import DNA_Generator, Exporter -def send_To_Record_JSON(input, reverse_order=False): +def send_To_Record_JSON(input, reverse_order=False): if input.enableLogic: if input.enable_Logic_Json and input.logicFile: input.logicFile = json.load(open(input.logicFile)) @@ -24,10 +24,10 @@ def send_To_Record_JSON(input, reverse_order=False): rule_type = item.rule_type item_list2 = item.item_list2 input.logicFile[f"Rule-{num}"] = { - "Items-1": item_list1.split(','), - "Rule-Type": rule_type, - "Items-2": item_list2.split(',') + "IF": item_list1.split(','), + rule_type: item_list2.split(',') } + print(rule_type) num += 1 else: input.logicFile = {} @@ -37,40 +37,43 @@ def send_To_Record_JSON(input, reverse_order=False): rule_type = item.rule_type item_list2 = item.item_list2 input.logicFile[f"Rule-{num}"] = { - "Items-1": item_list1.split(','), - "Rule-Type": rule_type, - "Items-2": item_list2.split(',') + "IF": item_list1.split(','), + rule_type: item_list2.split(',') } + print(rule_type) + num += 1 - - DNA_Generator.send_To_Record_JSON( input.collectionSize, - input.nftsPerBatch, - input.save_path, - input.enableRarity, - input.enableLogic, - input.logicFile, - 
input.enableMaterials, - input.materialsFile, - input.Blend_My_NFTs_Output, - input.batch_json_save_path - ) + + DNA_Generator.send_To_Record_JSON(input.collectionSize, + input.nftsPerBatch, + input.save_path, + input.enableRarity, + input.enableLogic, + input.logicFile, + input.enableMaterials, + input.materialsFile, + input.Blend_My_NFTs_Output, + input.batch_json_save_path + ) + def render_and_save_NFTs(input, reverse_order=False): - if input.enableCustomFields: scn = bpy.context.scene if reverse_order: for i in range(scn.custom_metadata_fields_index, -1, -1): item = scn.custom_metadata_fields[i] if item.field_name in list(input.custom_Fields.keys()): - raise ValueError(f"A duplicate of '{item.field_name}' was found. Please ensure all Custom Metadata field Names are unique.") + raise ValueError( + f"A duplicate of '{item.field_name}' was found. Please ensure all Custom Metadata field Names are unique.") else: input.custom_Fields[item.field_name] = item.field_value else: for item in scn.custom_metadata_fields: if item.field_name in list(input.custom_Fields.keys()): - raise ValueError(f"A duplicate of '{item.field_name}' was found. Please ensure all Custom Metadata field Names are unique.") + raise ValueError( + f"A duplicate of '{item.field_name}' was found. Please ensure all Custom Metadata field Names are unique.") else: input.custom_Fields[item.field_name] = item.field_value - Exporter.render_and_save_NFTs(input) \ No newline at end of file + Exporter.render_and_save_NFTs(input) diff --git a/main/Logic.py b/main/Logic.py index 2b835aa..d425c70 100644 --- a/main/Logic.py +++ b/main/Logic.py @@ -2,212 +2,288 @@ # The purpose of this file is to add logic and rules to the DNA that are sent to the NFTRecord.json file in DNA_Generator.py import bpy -import json import random import collections +from .Constants import bcolors, removeList, remove_file_by_extension, save_result -# Helper Functions -def isAttorVar(hierarchy, items_List): - items_returned = collections.defaultdict(list) - for i in items_List: - for j in hierarchy: - if i == j: # If i is an Attribute, add all i Variants to dictionary. - items_returned[i] = list(hierarchy[j].keys()) - items_returned[i].append("Empty") - - if i in list(hierarchy[j].keys()): - items_returned[j].append(i) - - # Check if all variants in an attribute were included, if so, add "Empty" variant. 
- for i in items_returned: - if list(items_returned[i]) == list(hierarchy[i].keys()): - items_returned[i].append("Empty") - - return dict(items_returned) - -def getAttIndex(hierarchy, attribute): - attList = list(hierarchy.keys()) - index = attList.index(attribute) - return index - -def getVarNum(variant): - if variant == "Empty": - num = '0' - else: - num = variant.split("_")[1] - return num - -def items_to_num(items_List): - num_List = {} - for i in items_List: - variant_num_list = [] - - for j in items_List[i]: - variant_num_list.append(getVarNum(j)) - - num_List[i] = variant_num_list - return num_List - -def rar_selectVar(hierarchy, items_List, deconstructed_DNA): - for attribute in items_List: - - a_attribute_index = getAttIndex(hierarchy, attribute) - - selected_variants = items_List[attribute] - hierarchy_selected_variants = list(hierarchy[attribute]) - - left_over_variants = [x for x in hierarchy_selected_variants if x not in selected_variants] - - if not left_over_variants: - deconstructed_DNA[int(a_attribute_index)] = "0" - else: - number_List_Of_i = [] - rarity_List_Of_i = [] - ifZeroBool = None - variantNum = None - - for a in left_over_variants: - number = a.split("_")[1] - rarity = a.split("_")[2] - - number_List_Of_i.append(int(number)) - rarity_List_Of_i.append(float(rarity)) - - for x in rarity_List_Of_i: - if x == 0: - ifZeroBool = True - elif x != 0: - ifZeroBool = False - - if ifZeroBool: - variantNum = random.choices(number_List_Of_i, k=1) - - if not ifZeroBool: - variantNum = random.choices(number_List_Of_i, weights=rarity_List_Of_i, k=1) - - deconstructed_DNA[int(a_attribute_index)] = str(variantNum[0]) - - return deconstructed_DNA def reconstructDNA(deconstructedDNA): reconstructed_DNA = "" for a in deconstructedDNA: num = "-" + str(a) reconstructed_DNA += num - return (''.join(reconstructed_DNA.split('-', 1))) + return ''.join(reconstructed_DNA.split('-', 1)) -def strip_empty_variant(num_list): - """Strips empty variants if full attribute collection. 
Used for processing below.""" - for i in num_list: - var_list = num_list[i] - if "0" in var_list: - var_list.remove("0") - num_list[i] = var_list - return num_list -# Rule Checks: -def never_with_Rule_Check(hierarchy, deconstructed_DNA, num_List1, num_List2): - """Returns True if singleDNA violates Never with Rule stated in Logic.json.""" - violates_rule = None +def get_var_info(variant, hierarchy): + # Get info for variant dict + name = variant.split("_")[0] + order_number = variant.split("_")[1] + rarity_number = variant.split("_")[2] + attribute = "" - num_List1 = strip_empty_variant(num_List1) - num_List2 = strip_empty_variant(num_List2) + for a in hierarchy: + for var in list(hierarchy[a].keys()): + if var == variant: + attribute = a + break + attribute_index = list(hierarchy.keys()).index(attribute) - for a in num_List1: - for b in num_List2: - if str(deconstructed_DNA[getAttIndex(hierarchy, a)]) in num_List1[a] and \ - str(deconstructed_DNA[getAttIndex(hierarchy, b)]) in num_List2[b]: - violates_rule = True - return violates_rule - else: - violates_rule = False - return violates_rule + return [name, order_number, rarity_number, attribute, attribute_index] # list of Var info sent back -def only_with_Rule_Check(hierarchy, deconstructed_DNA, num_List1, num_List2): - """Returns True if singleDNA violates Only with Rule stated in Logic.json.""" - violates_rule = None - for a in num_List1: - for b in num_List2: - if str(deconstructed_DNA[getAttIndex(hierarchy, a)]) in num_List1[a] and \ - str(deconstructed_DNA[getAttIndex(hierarchy, b)]) not in num_List2[b]: - violates_rule = True - return violates_rule +def apply_rules_to_dna(hierarchy, deconstructed_DNA, if_dict, result_dict, result_dict_type, enableRarity): + # Check if Variants in if_dict are in deconstructed_DNA, if so return if_list_selected = True: + if_list_selected = False + for a in deconstructed_DNA: + attribute_index = deconstructed_DNA.index(a) + attribute = list(hierarchy.keys())[attribute_index] - else: - violates_rule = False - return violates_rule + for b in hierarchy[attribute]: + if hierarchy[attribute][b]["number"] == a: + a_dna_var = b -def always_with_Rule_Check(hierarchy, deconstructed_DNA, num_List1, num_List2): - """Returns True if singleDNA violates Always with Rule stated in Logic.json.""" - violates_rule = None + if attribute in if_dict: + if a_dna_var in list(if_dict[attribute].keys()): + if_list_selected = True - for a in num_List2: - if str(deconstructed_DNA[getAttIndex(hierarchy, a)]) not in num_List2[a]: - violates_rule = True - return violates_rule + # Apply changes in accordance to Variants in 'result_dict' and 'if_list_selected' bool above: + for a in deconstructed_DNA: + attribute_index = deconstructed_DNA.index(a) + attribute = list(hierarchy.keys())[attribute_index] + + if attribute in result_dict: # Check if Attribute from DNA is in 'result_dict' + + # If 'a' is a full Attribute and Variants in if_dict not selected, set 'a' to empty (0): + if list(result_dict[attribute].keys()) == list(hierarchy[attribute].keys()) and not if_list_selected: + deconstructed_DNA[attribute_index] = "0" + + # If 'a' is a full Attribute and result_dict_type = "NOT", set 'a' to empty (0): + if list(result_dict[attribute].keys()) == list( + hierarchy[attribute].keys()) and if_list_selected and result_dict_type == "NOT": + deconstructed_DNA[attribute_index] = "0" + + # If Variants in if_dict are selected, set each attribute in 'result_dict' to a random or rarity selected Variant from + # 'result_dict[attribute]' 
variant_list: + if if_list_selected: + + # Invert 'items_returned' if 'NOT' rule is selected: + if result_dict_type == "NOT": + for a in result_dict: + var_selected_list = list(result_dict[a].keys()) # list of variants from 'NOT' + att_selected_list = list(hierarchy[a].keys()) # full list of variants from hierarchy attribute + + # If 'a' is not a full Attribute, invert the variants: + if len(var_selected_list) != len(att_selected_list): + var_selected_list = [i for i in att_selected_list if i not in var_selected_list] + + var_selected_list_complete = {} + for i in var_selected_list: + var_selected_list_complete[i] = get_var_info(i, hierarchy) + result_dict[a] = var_selected_list_complete + + for a in result_dict: + attribute_index = list(hierarchy.keys()).index(a) + attribute = list(hierarchy.keys())[attribute_index] + + variant_list = list(result_dict[a].keys()) + + if attribute in result_dict: # Check if Attribute from DNA is in 'then_dict' + + number_List_Of_i = [] + rarity_List_Of_i = [] + ifZeroBool = None + variantNum = None + + for b in variant_list: + number = b.split("_")[1] + rarity = b.split("_")[2] + + number_List_Of_i.append(int(number)) + rarity_List_Of_i.append(float(rarity)) + + for b in rarity_List_Of_i: + if b == 0: + ifZeroBool = True + elif b != 0: + ifZeroBool = False + + if enableRarity: + try: + if ifZeroBool: + variantNum = random.choices(number_List_Of_i, k=1) + elif not ifZeroBool: + variantNum = random.choices(number_List_Of_i, weights=rarity_List_Of_i, k=1) + except IndexError: + raise IndexError( + f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n" + f"An issue was found within the Attribute collection '{a}'. For more information on Blend_My_NFTs compatible scenes, " + f"see:\n{bcolors.RESET}" + f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n" + ) + else: + try: + variantNum = random.choices(number_List_Of_i, k=1) + except IndexError: + raise IndexError( + f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n" + f"An issue was found within the Attribute collection '{a}'. 
For more information on Blend_My_NFTs compatible scenes, " + f"see:\n{bcolors.RESET}" + f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n" + ) + deconstructed_DNA[int(attribute_index)] = str(variantNum[0]) + + return deconstructed_DNA + + +def get_rule_break_type(hierarchy, deconstructed_DNA, if_dict, result_dict, result_dict_type): + # Check if Variants in 'if_dict' found in deconstructed_DNA: + if_bool = False # True if Variant in 'deconstructed_DNA' found in 'if_dict' + for a in if_dict: # Attribute in 'if_dict' + for b in if_dict[a]: # Variant in if_dict[Attribute] + var_order_num = str(if_dict[a][b][1]) # Order number of 'b' (Variant) + dna_order_num = str( + deconstructed_DNA[if_dict[a][b][4]]) # Order Number of 'b's attribute in deconstructed_DNA + + if var_order_num == dna_order_num: # If DNA selected Variants found inside IF list variants: + if_bool = True + break else: - violates_rule = False - return violates_rule + continue + break + + # Check if Variants in 'result_dict' found in deconstructed_DNA: + full_att_bool = False + result_bool = False # True if Variant in 'deconstructed_DNA' found in 'result_dict' + for a in result_dict: # Attribute in 'result_dict' + for b in result_dict[a]: # Variant in if_dict[Attribute] + var_order_num = str(result_dict[a][b][1]) # Order number of 'b' (Variant) + dna_order_num = str( + deconstructed_DNA[result_dict[a][b][4]]) # Order Number of 'b's attribute in deconstructed_DNA + if var_order_num == dna_order_num: # If DNA selected Variants found inside THEN list variants: + if list(result_dict[a].keys()) == list(hierarchy[a].keys()): + full_att_bool = True + result_bool = True + break + else: + continue + break + + # Rule Bool return summary: + violates_rule = False + + # If Variants in 'if_dict' found in deconstructed_DNA and Variants in 'result_dict' not found in deconstructed_DNA: + if if_bool and not result_bool: + violates_rule = True + + elif if_bool and result_bool and result_dict_type == "NOT": + violates_rule = True + + # If Variants in 'if_dict' not found in deconstructed_DNA, and 'result_dict' variants are found in deconstructed_DNA, + # and they are a part of a full Attribute in 'then_dict' + elif not if_bool and result_bool and full_att_bool: + violates_rule = True + + # If Variants in 'if_dict' not found in deconstructed_DNA, but Variants in 'then_dict' are found in deconstructed_DNA, + # and don't make up a full Attribute: + # elif not if_bool and result_bool and not full_att_bool: + # violates_rule = False + + return violates_rule, if_bool, result_bool, full_att_bool -# Main Function -def logicafyDNAsingle(hierarchy, singleDNA, logicFile): +def create_dicts(hierarchy, rule_list_items, result_dict_type): + # Example of output structure: + structure = { + "attribute1": { + "variant1": [ + "name", + "order_number", + "rarity_number" + "attribute" + "attribute_index" + ], + "variant2": [ + "name", + "order_number", + "rarity_number" + "attribute" + "attribute_index" + ] + }, + "attribute2": { + "variant1": [ + "name", + "order_number", + "rarity_number" + "attribute" + "attribute_index" + ], + "variant2": [ + "name", + "order_number", + "rarity_number" + "attribute" + "attribute_index" + ] + } + } + items_returned = collections.defaultdict(dict) + for a in rule_list_items: + for b in hierarchy: + if a == b: # If 'a' is an Attribute, add all 'a' Variants to items_returned dict. 
+ variant_list_of_a = list(hierarchy[a].keys()) + variant_dict_of_a = {} + for c in variant_list_of_a: + variant_dict_of_a[c] = get_var_info(c, hierarchy) + + items_returned[a] = variant_dict_of_a + + if a in list(hierarchy[b].keys()): # If 'a' is a Variant, add all info about that variant to items_returned + items_returned[b][a] = get_var_info(a, hierarchy) + + items_returned = dict(items_returned) + + return dict(items_returned) + + +def logicafyDNAsingle(hierarchy, singleDNA, logicFile, enableRarity, enableMaterials): deconstructed_DNA = singleDNA.split("-") - didReconstruct = True originalDNA = str(singleDNA) while didReconstruct: didReconstruct = False for rule in logicFile: - items_List1 = isAttorVar(hierarchy, logicFile[rule]["Items-1"]) - items_List2 = isAttorVar(hierarchy, logicFile[rule]["Items-2"]) - num_List1 = items_to_num(items_List1) - num_List2 = items_to_num(items_List2) + # Items from 'IF' key for a given rule + if_dict = create_dicts(hierarchy, logicFile[rule]["IF"], "IF") - if logicFile[rule]["Rule-Type"] == "Never With": - if never_with_Rule_Check(hierarchy, deconstructed_DNA, num_List1, num_List2): + result_dict_type = "" + if "THEN" in logicFile[rule]: + result_dict_type = "THEN" - rand_bool = bool(random.getrandbits(1)) + if "NOT" in logicFile[rule]: + result_dict_type = "NOT" - if rand_bool: - deconstructed_DNA = rar_selectVar(hierarchy, items_List2, deconstructed_DNA) + result_dict = create_dicts(hierarchy, logicFile[rule][result_dict_type], result_dict_type) - if not rand_bool: - deconstructed_DNA = rar_selectVar(hierarchy, items_List1, deconstructed_DNA) + # Change 'then_bool' to 'result_bool' + violates_rule, if_bool, then_bool, full_att_bool = get_rule_break_type(hierarchy, deconstructed_DNA, + if_dict, result_dict, + result_dict_type) + if violates_rule: + # print(f"======={deconstructed_DNA} VIOLATES RULE======") - newDNA = reconstructDNA(deconstructed_DNA) - if newDNA != originalDNA: - originalDNA = str(newDNA) - didReconstruct = True - break + deconstructed_DNA = apply_rules_to_dna( + hierarchy, deconstructed_DNA, if_dict, result_dict, result_dict_type, enableRarity + ) - if logicFile[rule]["Rule-Type"] == "Only With": - if only_with_Rule_Check(hierarchy, deconstructed_DNA, num_List1, num_List2): - for b in num_List1: - if "0" in num_List1[b]: # If complete attribute - deconstructed_DNA[getAttIndex(hierarchy, b)] = "0" - - if "0" not in num_List1[b]: # Not complete attribute, select from other variants with rarity: - deconstructed_DNA = rar_selectVar(hierarchy, items_List1, deconstructed_DNA) - - newDNA = reconstructDNA(deconstructed_DNA) - if newDNA != originalDNA: - originalDNA = str(newDNA) - didReconstruct = True - break - - if logicFile[rule]["Rule-Type"] == "Always With": - if always_with_Rule_Check(hierarchy, deconstructed_DNA, num_List1, num_List2): - deconstructed_DNA = rar_selectVar(hierarchy, items_List1, deconstructed_DNA) - - newDNA = reconstructDNA(deconstructed_DNA) - if newDNA != originalDNA: - originalDNA = str(newDNA) - didReconstruct = True - break + newDNA = reconstructDNA(deconstructed_DNA) + if newDNA != originalDNA: + originalDNA = str(newDNA) + didReconstruct = True + break return str(reconstructDNA(deconstructed_DNA)) diff --git a/main/Material_Generator.py b/main/Material_Generator.py index cb02756..c5866ad 100644 --- a/main/Material_Generator.py +++ b/main/Material_Generator.py @@ -7,36 +7,58 @@ import bpy import json import random +from .Constants import bcolors, removeList, remove_file_by_extension, save_result -def 
select_material(materialList): +def select_material(materialList, variant, enableRarity): """Selects a material from a passed material list. """ - - number_List_Of_i = [] + material_List_Of_i = [] # List of Material names instead of order numbers rarity_List_Of_i = [] ifZeroBool = None for material in materialList: + # Material Order Number comes from index in the Material List in materials.json for a given Variant. + # material_order_num = list(materialList.keys()).index(material) - material_order_num = material.split("_")[1] - number_List_Of_i.append(material_order_num) + material_List_Of_i.append(material) - material_rarity_percent = material.split("_")[1] + material_rarity_percent = materialList[material] rarity_List_Of_i.append(float(material_rarity_percent)) - for x in rarity_List_Of_i: - if x == 0: + print(f"MATERIAL_LIST_OF_I:{material_List_Of_i}") + print(f"RARITY_LIST_OF_I:{rarity_List_Of_i}") + + for b in rarity_List_Of_i: + if b == 0: ifZeroBool = True - break - elif x != 0: + elif b != 0: ifZeroBool = False - if ifZeroBool: - selected_material = random.choices(number_List_Of_i, k=1) - elif not ifZeroBool: - selected_material = random.choices(number_List_Of_i, weights=rarity_List_Of_i, k=1) + if enableRarity: + try: + if ifZeroBool: + selected_material = random.choices(material_List_Of_i, k=1) + elif not ifZeroBool: + selected_material = random.choices(material_List_Of_i, weights=rarity_List_Of_i, k=1) + except IndexError: + raise IndexError( + f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n" + f"An issue was found within the Material List of the Variant collection '{variant}'. For more information on Blend_My_NFTs compatible scenes, " + f"see:\n{bcolors.RESET}" + f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n" + ) + else: + try: + selected_material = random.choices(material_List_Of_i, k=1) + except IndexError: + raise IndexError( + f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n" + f"An issue was found within the Material List of the Variant collection '{variant}'. For more information on Blend_My_NFTs compatible scenes, " + f"see:\n{bcolors.RESET}" + f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n" + ) - return selected_material[0] + return selected_material[0], materialList def get_variant_att_index(variant, hierarchy): variant_attribute = None @@ -69,7 +91,7 @@ def match_DNA_to_Variant(hierarchy, singleDNA): dnaDictionary.update({x: k}) return dnaDictionary -def apply_materials(hierarchy, singleDNA, materialsFile): +def apply_materials(hierarchy, singleDNA, materialsFile, enableRarity): """ DNA with applied material example: "1-1:1-1" : @@ -85,16 +107,28 @@ def apply_materials(hierarchy, singleDNA, materialsFile): complete = False for b in materialsFile: if singleDNADict[a] == b: - mat = select_material(materialsFile[b]['Material List']) - deconstructed_MaterialDNA[a] = mat + material_name, materialList, = select_material(materialsFile[b]['Material List'], b, enableRarity) + material_order_num = list(materialList.keys()).index(material_name) # Gets the Order Number of the Material + deconstructed_MaterialDNA[a] = str(material_order_num + 1) complete = True if not complete: deconstructed_MaterialDNA[a] = "0" + # This section is now incorrect and needs updating: + + # Make Attributes have the same materials: + # Order your Attributes alphabetically, then assign each Attribute a number, starting with 0. So Attribute 'A' = 0, + # Attribute 'B' = 1, 'C' = 2, 'D' = 3, etc. 
For each pair you want to equal another, add its number it to this list: + # synced_material_attributes = [1, 2] + # + # first_mat = deconstructed_MaterialDNA[synced_material_attributes[0]] + # for i in synced_material_attributes: + # deconstructed_MaterialDNA[i] = first_mat + material_DNA = "" for a in deconstructed_MaterialDNA: num = "-" + str(deconstructed_MaterialDNA[a]) material_DNA += num material_DNA = ''.join(material_DNA.split('-', 1)) - return f"{singleDNA}:{material_DNA}" + return f"{singleDNA}:{material_DNA}" \ No newline at end of file diff --git a/main/Rarity.py b/main/Rarity.py index 2cabc49..eff5c7c 100644 --- a/main/Rarity.py +++ b/main/Rarity.py @@ -4,6 +4,9 @@ import bpy import random +from .Constants import bcolors, removeList, remove_file_by_extension + + def createDNArarity(hierarchy): """ Sorts through DataDictionary and appropriately weights each variant based on their rarity percentage set in Blender @@ -14,7 +17,6 @@ def createDNArarity(hierarchy): for i in hierarchy: number_List_Of_i = [] rarity_List_Of_i = [] - count = 0 ifZeroBool = None for k in hierarchy[i]: @@ -24,19 +26,25 @@ def createDNArarity(hierarchy): rarity = hierarchy[i][k]["rarity"] rarity_List_Of_i.append(float(rarity)) - count += 1 - for x in rarity_List_Of_i: if x == 0: ifZeroBool = True elif x != 0: ifZeroBool = False - if ifZeroBool: - variantByNum = random.choices(number_List_Of_i, k=1) - elif not ifZeroBool: - variantByNum = random.choices(number_List_Of_i, weights=rarity_List_Of_i, k=1) + try: + if ifZeroBool: + variantByNum = random.choices(number_List_Of_i, k=1) + elif not ifZeroBool: + variantByNum = random.choices(number_List_Of_i, weights=rarity_List_Of_i, k=1) + except IndexError: + raise IndexError( + f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n" + f"An issue was found within the Attribute collection '{i}'. For more information on Blend_My_NFTs compatible scenes, " + f"see:\n{bcolors.RESET}" + f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n" + ) singleDNA += "-" + str(variantByNum[0]) singleDNA = ''.join(singleDNA.split('-', 1)) - return singleDNA + return singleDNA \ No newline at end of file From fa1c2c5883a9838d0ee934d58099824e60e8b7a3 Mon Sep 17 00:00:00 2001 From: Torrin Leonard <82110564+torrinworx@users.noreply.github.com> Date: Thu, 11 Aug 2022 10:26:16 -0400 Subject: [PATCH 4/7] Fixing issues with failed batches Resolved conflicts with failed batches recovery related to headless variable integration from previous merge. Fixed issue where animations would get corrupt when resuming failed batch. The last file to be generated will now be deleted if it exists to avoid this issue. 
--- __init__.py | 62 ++++++++++++++++++++++++++++++++---------------- main/Exporter.py | 46 ++++++++++++++++++++++++++++++++--- 2 files changed, 85 insertions(+), 23 deletions(-) diff --git a/__init__.py b/__init__.py index 6d4a0e8..48fed35 100644 --- a/__init__.py +++ b/__init__.py @@ -551,42 +551,64 @@ class resume_failed_batch(bpy.types.Operator): _fail_state, _failed_batch, _failed_dna, _failed_dna_index = Checks.check_FailedBatches(_batch_json_save_path) + render_settings = batchData["Generation Save"][-1]["Render_Settings"] + input = BMNFTData( - nftName=batchData["Generation Save"][-1]["Render_Settings"]["nftName"], + nftName=render_settings["nftName"], save_path=_save_path, - collectionSize=batchData["Generation Save"][-1]["Render_Settings"]["collectionSize"], + nftsPerBatch=render_settings["nftsPerBatch"], + batchToGenerate=render_settings["batchToGenerate"], + collectionSize=render_settings["collectionSize"], Blend_My_NFTs_Output=_Blend_My_NFTs_Output, batch_json_save_path=_batch_json_save_path, - nftBatch_save_path=batchData["Generation Save"][-1]["Render_Settings"]["nftBatch_save_path"], + nftBatch_save_path=render_settings["nftBatch_save_path"], - enableImages=batchData["Generation Save"][-1]["Render_Settings"]["enableImages"], - imageFileFormat=batchData["Generation Save"][-1]["Render_Settings"]["imageFileFormat"], + enableImages=render_settings["enableImages"], + imageFileFormat=render_settings["imageFileFormat"], - enableAnimations=batchData["Generation Save"][-1]["Render_Settings"]["enableAnimations"], - animationFileFormat=batchData["Generation Save"][-1]["Render_Settings"]["animationFileFormat"], + enableAnimations=render_settings["enableAnimations"], + animationFileFormat=render_settings["animationFileFormat"], - enableModelsBlender=batchData["Generation Save"][-1]["Render_Settings"]["enableModelsBlender"], - modelFileFormat=batchData["Generation Save"][-1]["Render_Settings"]["modelFileFormat"], + enableModelsBlender=render_settings["enableModelsBlender"], + modelFileFormat=render_settings["modelFileFormat"], - enableCustomFields=batchData["Generation Save"][-1]["Render_Settings"]["enableCustomFields"], - custom_Fields=batchData["Generation Save"][-1]["Render_Settings"]["custom_Fields"], + enableCustomFields=render_settings["enableCustomFields"], - cardanoMetaDataBool=batchData["Generation Save"][-1]["Render_Settings"]["cardanoMetaDataBool"], - solanaMetaDataBool=batchData["Generation Save"][-1]["Render_Settings"]["solanaMetaDataBool"], - erc721MetaData=batchData["Generation Save"][-1]["Render_Settings"]["erc721MetaData"], + cardanoMetaDataBool=render_settings["cardanoMetaDataBool"], + solanaMetaDataBool=render_settings["solanaMetaDataBool"], + erc721MetaData=render_settings["erc721MetaData"], - cardano_description=batchData["Generation Save"][-1]["Render_Settings"]["cardano_description"], - solana_description=batchData["Generation Save"][-1]["Render_Settings"]["solana_description"], - erc721_description=batchData["Generation Save"][-1]["Render_Settings"]["erc721_description"], + cardano_description=render_settings["cardano_description"], + solana_description=render_settings["solana_description"], + erc721_description=render_settings["erc721_description"], - enableMaterials=batchData["Generation Save"][-1]["Render_Settings"]["enableMaterials"], - materialsFile=batchData["Generation Save"][-1]["Render_Settings"]["materialsFile"], + enableMaterials=render_settings["enableMaterials"], + materialsFile=render_settings["materialsFile"], + + 
enableLogic=render_settings["enableLogic"], + enable_Logic_Json=render_settings["enable_Logic_Json"], + logicFile=render_settings["logicFile"], + + enableRarity=render_settings["enableRarity"], + + enableAutoShutdown=render_settings["enableAutoShutdown"], + + specify_timeBool=render_settings["specify_timeBool"], + hours=render_settings["hours"], + minutes=render_settings["minutes"], + + emailNotificationBool=render_settings["emailNotificationBool"], + sender_from=render_settings["sender_from"], + email_password=render_settings["email_password"], + receiver_to=render_settings["receiver_to"], fail_state=_fail_state, failed_batch=_failed_batch, failed_dna=_failed_dna, - failed_dna_index=_failed_dna_index + failed_dna_index=_failed_dna_index, + + custom_Fields=render_settings["custom_Fields"], ) Exporter.render_and_save_NFTs(input) diff --git a/main/Exporter.py b/main/Exporter.py index dea61bf..76a67cb 100644 --- a/main/Exporter.py +++ b/main/Exporter.py @@ -43,9 +43,9 @@ def save_generation_state(input): "DNA Generated": None, "Generation Start Date and Time": [CURRENT_TIME, CURRENT_DATE, LOCAL_TIMEZONE], "Render_Settings": { - "nftName": input.nftName, "save_path": input.save_path, + "nftsPerBatch": input.nftsPerBatch, "batchToGenerate": input.batchToGenerate, "collectionSize": input.collectionSize, @@ -63,7 +63,6 @@ def save_generation_state(input): "modelFileFormat": input.modelFileFormat, "enableCustomFields": input.enableCustomFields, - "custom_Fields": input.custom_Fields, "cardanoMetaDataBool": input.cardanoMetaDataBool, "solanaMetaDataBool": input.solanaMetaDataBool, @@ -76,6 +75,24 @@ def save_generation_state(input): "enableMaterials": input.enableMaterials, "materialsFile": input.materialsFile, + "enableLogic": input.enableLogic, + "enable_Logic_Json": input.enable_Logic_Json, + "logicFile": input.logicFile, + + "enableRarity": input.enableRarity, + + "enableAutoShutdown": input.enableAutoShutdown, + + "specify_timeBool": input.specify_timeBool, + "hours": input.hours, + "minutes": input.minutes, + + "emailNotificationBool": input.emailNotificationBool, + "sender_from": input.sender_from, + "email_password": input.email_password, + "receiver_to": input.receiver_to, + + "custom_Fields": input.custom_Fields, }, }) @@ -272,12 +289,24 @@ def render_and_save_NFTs(input): solanaMetadataPath = os.path.join(batchFolder, "Solana_metadata") erc721MetadataPath = os.path.join(batchFolder, "Erc721_metadata") + + def check_failed_exists(file_path): + # Delete a file if a fail state is detected and if the file being re-generated already exists. Prevents + # animations from corrupting. 
+ + if input.fail_state: + if os.path.exists(file_path): + os.remove(file_path) + # Generation/Rendering: if input.enableImages: + print(f"{bcolors.OK}---Image---{bcolors.RESET}") image_render_time_start = time.time() + check_failed_exists(imagePath) + def render_image(): if not os.path.exists(imageFolder): os.makedirs(imageFolder) @@ -302,6 +331,8 @@ def render_and_save_NFTs(input): animation_render_time_start = time.time() + check_failed_exists(animationPath) + def render_animation(): if not os.path.exists(animationFolder): os.makedirs(animationFolder) @@ -373,40 +404,49 @@ def render_and_save_NFTs(input): # obj.select_set(False) if input.modelFileFormat == 'GLB': + check_failed_exists(f"{modelPath}.glb") bpy.ops.export_scene.gltf(filepath=f"{modelPath}.glb", check_existing=True, export_format='GLB', export_keep_originals=True, use_selection=True) if input.modelFileFormat == 'GLTF_SEPARATE': + check_failed_exists(f"{modelPath}.gltf") + check_failed_exists(f"{modelPath}.bin") bpy.ops.export_scene.gltf(filepath=f"{modelPath}", check_existing=True, export_format='GLTF_SEPARATE', export_keep_originals=True, use_selection=True) if input.modelFileFormat == 'GLTF_EMBEDDED': + check_failed_exists(f"{modelPath}.gltf") bpy.ops.export_scene.gltf(filepath=f"{modelPath}.gltf", check_existing=True, export_format='GLTF_EMBEDDED', export_keep_originals=True, use_selection=True) elif input.modelFileFormat == 'FBX': + check_failed_exists(f"{modelPath}.fbx") bpy.ops.export_scene.fbx(filepath=f"{modelPath}.fbx", check_existing=True, use_selection=True) elif input.modelFileFormat == 'OBJ': + check_failed_exists(f"{modelPath}.obj") bpy.ops.export_scene.obj(filepath=f"{modelPath}.obj", check_existing=True, use_selection=True, ) elif input.modelFileFormat == 'X3D': + check_failed_exists(f"{modelPath}.x3d") bpy.ops.export_scene.x3d(filepath=f"{modelPath}.x3d", check_existing=True, use_selection=True) elif input.modelFileFormat == 'STL': + check_failed_exists(f"{modelPath}.stl") bpy.ops.export_mesh.stl(filepath=f"{modelPath}.stl", check_existing=True, use_selection=True) elif input.modelFileFormat == 'VOX': + check_failed_exists(f"{modelPath}.vox") bpy.ops.export_vox.some_data(filepath=f"{modelPath}.vox") # Loading Animation: @@ -417,7 +457,7 @@ def render_and_save_NFTs(input): model_generation_time_end = time.time() print( - f"{bcolors.OK}Generated model in {model_generation_time_end - model_generation_time_start}s.\n{bcolors.RESET}" + f"{bcolors.OK}Generated 3D model in {model_generation_time_end - model_generation_time_start}s.\n{bcolors.RESET}" ) # Generating Metadata: From 57f8d6defadc2d8b6b24e62ab85f5bf60614db57 Mon Sep 17 00:00:00 2001 From: Torrin Leonard <82110564+torrinworx@users.noreply.github.com> Date: Thu, 11 Aug 2022 11:39:32 -0400 Subject: [PATCH 5/7] Update README.md --- README.md | 17 ++++++----------- 1 file changed, 6 insertions(+), 11 deletions(-) diff --git a/README.md b/README.md index e9a7a35..87f98f7 100644 --- a/README.md +++ b/README.md @@ -11,18 +11,17 @@

 ## Description
-Blend_My_NFTs is an open source, free to use Blender add on that enables you to automatically generate thousands of 3D Models, Animations, and Images. This add on's primary purpose is to aid in the creation of large generative 3D NFT collections.
+Blend_My_NFTs is an open-source, free-to-use Blender add-on that enables you to easily generate thousands of 3D Models, Animations, and Images. This add-on's primary purpose is to aid in the creation of large generative 3D NFT collections. It is the first and easiest 3D NFT generator.
 
-For support, help, and questions, please join our wonderful Discord community: https://discord.gg/UpZt5Un57t
+For support, help, and questions, please join our wonderful [Discord community!](https://discord.gg/UpZt5Un57t)
 
-Checkout the newest tutorial on YouTube that goes along with this documentation: https://www.youtube.com/watch?v=ygKJYz4BjRs
-
-This add on was developed to create the This Cozy Place NFT project which is now availabe to mint on [ThisCozyStudio.com](https://www.thiscozystudio.com/cozy-mint/)
+[Check out the newest YouTube tutorial!](https://youtu.be/ygKJYz4BjRs)
+Blend_My_NFTs was initially developed to create Cozy Place, an NFT collection by This Cozy Studio Inc.
 
 https://user-images.githubusercontent.com/82110564/147833465-965be08b-ca5f-47ba-a159-b92ff775ee14.mov
 
-The video above illustrates the first 10 Cozy Place NFTs generated with Blend_My_NFts.
+The video above illustrates the first 10 Cozy Place NFTs generated with a very early prototype version of Blend_My_NFTs.
 
 ## Official Links:
@@ -41,7 +40,7 @@ Reddit: https://www.reddit.com/r/ThisCozyPlace/
 
 ## Quick Disclaimer
 
-Blend_My_NFTs works with Blender 3.1.0+ on Windows 10 or macOS Big Sur 11.6. Linux is supported, however we haven't had the chance to test and guarantee this functionality. Any rendering engine works; Cycles, Eevee, and Octane have all been used by the community. This add-on only works in Blender, a Cinima 4D port is being investigated.
+Blend_My_NFTs works with Blender 3.2.2 on Windows 10 or macOS Big Sur 11.6. Linux is supported; however, we haven't had the chance to test and guarantee this functionality. Windows 11 has not yet been tested. Any rendering engine works; Cycles, Eevee, and Octane have all been used by the community. This add-on only works in Blender; a Cinema 4D port is being investigated.
 
 ## Example Files
 The YouTube tutorials use three different .blend example files.
This repository - [Custom Fields Schema](#custom-fields-schema) - [Logic](#logic) - [Logic JSON File Schema](#logic-json-schema) - - [Schema Definition](#schema-definition) - - [Rule Types](#rule-types) - [Example Logic.json File](#example-logicjson-file) - - [Never with, Logic Rule Examples](#never-with-logic-rule-examples) - - [Only with, Logic Rule Examples](#only-with-logic-rule-examples) - [Notes on Rarity and Weighted Variants](#notes-on-rarity-and-weighted-variants) - [.Blend File Rarity Example](#blend-file-rarity-examples) - [More complex Rarity Example](#more-complex-rarity-example) From 9dbe3bbe469e14ce9b71894b6c561d7549adb286 Mon Sep 17 00:00:00 2001 From: Torrin Leonard <82110564+torrinworx@users.noreply.github.com> Date: Thu, 11 Aug 2022 12:01:56 -0400 Subject: [PATCH 6/7] Fixing Logic UI Layout --- UILists/Logic_UIList.py | 18 ++++++++++-------- 1 file changed, 10 insertions(+), 8 deletions(-) diff --git a/UILists/Logic_UIList.py b/UILists/Logic_UIList.py index 8bed03f..0f82f42 100644 --- a/UILists/Logic_UIList.py +++ b/UILists/Logic_UIList.py @@ -58,7 +58,7 @@ class CUSTOM_OT_logic_actions(Operator): if self.action == 'ADD': if context.object: item = scn.logic_fields.add() - item.name = "Rule" # The name of each object + item.name = "Rule" scn.logic_fields_index = len(scn.logic_fields) - 1 info = '"%s" added to list' % (item.name) self.report({'INFO'}, info) @@ -93,14 +93,16 @@ class CUSTOM_OT_logic_clearList(Operator): # ======== UILists ======== # class CUSTOM_UL_logic_items(UIList): def draw_item(self, context, layout, data, item, icon, active_data, active_propname, index): - split = layout.split(factor=0.1) - split.label(text=f"{index + 1}") - row = split.row() - row.label(text=item.name) # avoids renaming the item by accident - row.prop(item, "item_list1", text="") + layout = layout.split(factor=0.1) + col = layout.column() + col.label(text=f" Rule {index + 1}") - row.prop(item, "rule_type", text="") - row.prop(item, "item_list2", text="") + col = layout.column() + col.label(text="") + col.prop(item, "item_list1", text="") + + col.prop(item, "rule_type", text="") + col.prop(item, "item_list2", text="") def invoke(self, context, event): pass From 54f57b794f7a9cd3f8b4ad0b113112881c295d06 Mon Sep 17 00:00:00 2001 From: Torrin Leonard <82110564+torrinworx@users.noreply.github.com> Date: Thu, 11 Aug 2022 12:19:39 -0400 Subject: [PATCH 7/7] Update README.md --- README.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index 87f98f7..34885ca 100644 --- a/README.md +++ b/README.md @@ -19,8 +19,10 @@ For support, help, and questions, please join our wonderful [Discord community!] Blend_My_NFTs was initially developed to create Cozy Place, an NFT collection by This Cozy Studio Inc. + https://user-images.githubusercontent.com/82110564/147833465-965be08b-ca5f-47ba-a159-b92ff775ee14.mov + The video above illustrates the first 10 Cozy Place NFTs generated with a very early prototype version of Blend_My_NFTs. @@ -50,8 +52,6 @@ The YouTube tutorials use three different .blend example files. This repository - [Blend_My_NFTs](#blend_my_nfts) - [Description](#description) - [Official Links](#official-links) - - [Case Studies](#case-studies) - - [Donations](#donations) - [Quick Disclaimer](#quick-disclaimer) - [Example Files](#example-files) - [Table of Contents](#table-of-contents)