Mirror of https://github.com/torrinworx/Blend_My_NFTs
commit 7d1e1eb8f2

README.md
@@ -15,9 +15,9 @@ Blend_My_NFTs is an open source, free to use Blender add on that enables you to

 For support, help, and questions, please join our wonderful Discord community: https://discord.gg/UpZt5Un57t

-Checkout the newest tutorial on YouTube that goes along with this documentation: https://www.youtube.com/watch?v=SwU4iVy1XpU
+Check out the newest tutorial on YouTube that goes along with this documentation: https://www.youtube.com/watch?v=ygKJYz4BjRs

-This add on was developed to create the This Cozy Place NFT project which is now availabe to mint on [ThisCozyStudio.com](https://thiscozystudio.com/)
+This add-on was developed to create the This Cozy Place NFT project, which is now available to mint on [ThisCozyStudio.com](https://www.thiscozystudio.com/cozy-mint/)

 https://user-images.githubusercontent.com/82110564/147833465-965be08b-ca5f-47ba-a159-b92ff775ee14.mov
@@ -40,37 +40,8 @@ Instagram: https://www.instagram.com/this_cozy_studio/

 Reddit: https://www.reddit.com/r/ThisCozyPlace/

-## Case Studies
-
-This document has a list of projects that use Blend_My_NFTs to help facilitate them in the creation of their collection:
-
-https://docs.google.com/document/d/e/2PACX-1vSHZS4GRu8xXDYpVPEaxyBeTzms9yrJEC9IoAcP38_U8x0C1kVrbtNZgh0zUmkzBoZQVwNvBf3ldRij/pub
-
-## Donations
-
-Blend_My_NFTs, this readme documenation, YouTube tutorials, live stream Q/As, and the Discord community are all provided for free by This Cozy Studio for anyone to use and access. We only ask in return that you credit this software and kindly share what our team has built. A direct link to the Blend_My_NFTs Github page on your projects website (or equivelant social platform) would sefice. We ask you to share this tool because we feel there are many out there that would benefit from it, our only goal is to help those in need. It warms our hearts that so many people use this add-on.
-
-Any donations to the following methods will be put towards developing Blend_My_NFTs and future related Metaverse/Blockchain projects. This Cozy Studio has big plans for Blend_My_NFTs in 2022 and we value your support!
-
-- PayPal: https://www.paypal.com/paypalme/TorrinLeonard
-
-Crypto Addresses:
-
-- Cardano: `addr1qxzuqz0c32ucga8amwk53unt7vhyf56q73x55aec2lm8esv9cqyl3z4es360mkadfrexhuewgnf5pazdffmns4lk0nqsfylz24`
-- Solana: `A7NuHB79DKfkdZMvqVzBrYN4NXRqP7LVFjMdVoKRfVmo`
-- Ethereum: `0x335408858ce319Cb411090792Ba4BCEE6a2d10CB`
-- USDC (ETH Network): `0x335408858ce319Cb411090792Ba4BCEE6a2d10CB`
-
-We at This Cozy Studio really appreciate all the support our community has given us, you push us forward and inspire us to accomplish great things. We are nothing without you.
-
-Thank you,
-
-- This Cozy Studio team

 ## Quick Disclaimer

-Blend_My_NFTs works with Blender 3.0.0 on Windows 10 or macOS Big Sur 11.6. Linux is supported, however I haven't had the chance to test this functionality and guarantee this. Any rendering engine works; Cycles, Eevee, and Octane have all been used by the community without issue. This add-on only works in Blender, a Cinima 4D port will be investigated in the future.
+Blend_My_NFTs works with Blender 3.1.0+ on Windows 10 or macOS Big Sur 11.6. Linux is supported; however, we haven't had the chance to test and guarantee this functionality. Any rendering engine works: Cycles, Eevee, and Octane have all been used by the community. This add-on only works in Blender; a Cinema 4D port is being investigated.

 ## Example Files

 The YouTube tutorials use three different .blend example files. This repository has all three and includes a readme.md file that outlines which videos use which files and by what date: https://github.com/torrinworx/BMNFTs_Examples
@@ -96,7 +67,7 @@ The YouTube tutorials use three different .blend example files. This repository

 - [Custom Metadata Fields](#custom-metadata-fields)
   - [Custom Fields Schema](#custom-fields-schema)
 - [Logic](#logic)
-  - [Logic JSON Schema](#logic-json-schema)
+  - [Logic JSON File Schema](#logic-json-schema)
   - [Schema Definition](#schema-definition)
   - [Rule Types](#rule-types)
   - [Example Logic.json File](#example-logicjson-file)
@@ -405,63 +376,91 @@ After completeing the `Refactor Batches & Create MetaData` step, you should have

 Congratulations!! You now have a complete 3D NFT collection that is ready to upload to the blockchain of your choice!

+# Randomizing Materials
+
+To enable the Material Randomizer, check the `Enable Materials` checkbox in the `Create NFT Data` panel:
+
+<img width="668" alt="Screen Shot 2022-04-24 at 4 01 41 PM" src="https://user-images.githubusercontent.com/82110564/164994393-ccdade65-df9c-460b-ae3e-0806f9d08692.png">
+
+Materials are determined by a .json file that you manually create. For the purposes of this documentation, just think of JSON as a text file (.txt) that we can use to store information. You can name this file anything, but for this tutorial let's call it `Materials.json`.
+
+If you need help creating a JSON file, check out this tutorial: [How to Create JSON File?](https://codebeautify.org/blog/how-to-create-json-file/)
+
+To learn more about JSON files and how to structure data, read this article: [Working with JSON](https://developer.mozilla.org/en-US/docs/Learn/JavaScript/Objects/JSON)
+
+Materials compatible with the Material Randomizer must follow this naming convention: `<Material Name>_<Material Order Number>_<Material Rarity>`
+
+## Material Randomizer JSON Schema
+
+If you'd like, copy and paste this template into the JSON file you created above:
+
+```
+{
+    "<Variant Name>": {
+        "Variant Objects": ["<Object in Variant Collection 1>", "<Object in Variant Collection 1>"],
+        "Material List": ["<Material Name>_<Material Order Number>_<Material Rarity>", "<Material Name>_<Material Order Number>_<Material Rarity>"]
+    },
+    "Red Cone_1_0": {
+        "Variant Objects": ["<Object in Variant Collection 1>", "<Object in Variant Collection 1>"],
+        "Material List": ["<Material Name>_<Material Order Number>_<Material Rarity>", "<Material Name>_<Material Order Number>_<Material Rarity>"]
+    }
+}
+```
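Entries in this file follow the `<Material Name>_<Order Number>_<Rarity>` convention described above. A small illustrative parser for that convention (the helper function and sample data are assumptions for demonstration, not part of the add-on):

```python
def parse_material_name(material: str):
    """Split '<Material Name>_<Order Number>_<Rarity>' into its parts.
    The material name itself may contain spaces, so split from the right."""
    name, order, rarity = material.rsplit("_", 2)
    return name, int(order), float(rarity)

# Entries shaped like the schema above (hypothetical example values):
materials = {
    "Red Cone_1_0": {
        "Variant Objects": ["Cone"],
        "Material List": ["Shiny Red_1_50", "Matte Red_2_50"],
    }
}

for spec in materials.values():
    for mat in spec["Material List"]:
        print(parse_material_name(mat))
```

A quick check like this catches misnamed materials before a long render, since a name without two trailing `_<number>` parts raises a `ValueError`.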
# Custom Metadata Fields

 This section will cover how to implement custom metadata fields. The method is the same for the Cardano CIP-25, Solana, and ERC721 standards.

-These fields are determined by a .json file that you manually create. For the pruposes of this documentation, just think of JSON as a text file (.txt) that we can use to store information. You can name this file anything, but for this tutorial lets call it `Custom_Fields.json`.
+Custom Metadata fields can be added to the ERC721, CIP-25, and Solana templates. In the `Refactor Batches & Create Metadata` panel, check the `Enable Custom Metadata Fields` checkbox:

-If you need help creating a JSON file, checkout this tutorial: [How to Create JSON File?](https://codebeautify.org/blog/how-to-create-json-file/)
+<img width="671" alt="Screen Shot 2022-04-24 at 3 50 27 PM" src="https://user-images.githubusercontent.com/82110564/164993989-e34f4ecc-37d3-41cf-9bcc-222727a2ab8d.png">

-To learn more about JSON files and how to structure data read this article: [Working with JSON](https://developer.mozilla.org/en-US/docs/Learn/JavaScript/Objects/JSON)
+You can add and remove Custom Fields in this list using the + and - buttons: <img width="34" alt="Screen Shot 2022-04-24 at 3 35 16 PM" src="https://user-images.githubusercontent.com/82110564/164993415-71feb5e0-56d1-4143-b423-3e3cde97de30.png">

 ## Custom Fields Schema

-If you'd like, copy and paste this template into the JSON file you created above:
+Each item in the list has two inputs: the `Name` and the `Value`. Each of these fields will be added to the Metadata Template you select when you refactor your batches.

 ```
 {
     "<item 1>": "<content of item 1>",
     "<item 2>": "<content of item 2>",
     "<item 3>": "<content of item 3>",
     "<item 4>": "<content of item 4>"
     ...
 }
 ```

 **Important:** All Field `Names` must be unique.

 Each item in this dictionary will be sent to the attributes field of a given metadata standard. For example, this is what a Cardano template would look like once these fields are applied:

 ```
 {
   "721": {
     "<policy_id>": {
       "Logic Test_1": {
         "name": "Logic Test_1",
         "image": "",
         "mediaType": "",
         "description": "",
         "Cube": "Red Cube",
         "Sphere": "Red Sphere",
         "<item 1>": "<content of item 1>",
         "<item 2>": "<content of item 2>",
         "<item 3>": "<content of item 3>",
         "<item 4>": "<content of item 4>"
       }
     },
     "version": "1.0"
   }
 }
 ```
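The merge described above amounts to copying each Name/Value pair into the token's attribute map while enforcing the uniqueness rule. A minimal sketch, assuming a CIP-25-shaped dictionary like the template above (the function name and sample fields are illustrative, not the add-on's actual API):

```python
import json

def apply_custom_fields(metadata: dict, custom_fields: dict) -> dict:
    """Copy each custom Name/Value pair into a CIP-25 token entry.
    Illustrative only -- Blend_My_NFTs does this internally when you
    refactor batches with `Enable Custom Metadata Fields` checked."""
    token = metadata["721"]["<policy_id>"]["Logic Test_1"]
    for name, value in custom_fields.items():
        if name in token:
            # Mirrors the "all Field Names must be unique" requirement.
            raise ValueError(f"Field name {name!r} is not unique")
        token[name] = value
    return metadata

template = {
    "721": {
        "<policy_id>": {
            "Logic Test_1": {
                "name": "Logic Test_1",
                "image": "",
                "mediaType": "",
                "description": "",
                "Cube": "Red Cube",
                "Sphere": "Red Sphere",
            }
        },
        "version": "1.0",
    }
}

result = apply_custom_fields(template, {"Artist": "This Cozy Studio", "Season": "1"})
print(json.dumps(result, indent=2))
```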
 # Logic

-This section will go over the process of creating and using rules for your NFT collection, we will refer to this process as Logic.
-
-Logic is deterimened by a .json file that you manually create. For the purposes of this documentation, just think of JSON as a text file (.txt) that we can use to store information. You can name this file anything, but for this tutorial lets call it `Logic.json`.
-
-If you need help creating a JSON file, checkout this tutorial: [How to Create JSON File?](https://codebeautify.org/blog/how-to-create-json-file/)
-
-To learn more about JSON files and how to structure data read this article: [Working with JSON](https://developer.mozilla.org/en-US/docs/Learn/JavaScript/Objects/JSON)
+This section will go over the process of creating and using rules for your NFT collection; we will refer to this process as Logic. To enable Logic, check the `Enable Logic` checkbox in the `Create NFT Data` panel:
+
+<img width="438" alt="Screen Shot 2022-04-24 at 3 32 43 PM" src="https://user-images.githubusercontent.com/82110564/164993321-50381d0c-bf95-4046-af9e-bd1a6eeb8bdf.png">
+
+You can add and remove Rules in this list using the + and - buttons: <img width="34" alt="Screen Shot 2022-04-24 at 3 35 16 PM" src="https://user-images.githubusercontent.com/82110564/164993415-71feb5e0-56d1-4143-b423-3e3cde97de30.png">
+
+Each item in the list has two text fields labeled `Item List 1` and `Item List 2`, and a drop-down menu where you can select the type of rule:
+
+<img width="552" alt="Screen Shot 2022-04-24 at 3 37 03 PM" src="https://user-images.githubusercontent.com/82110564/164993481-335bcc86-1fd1-4a7f-9e82-58c4a4b31642.png">
+
+The best way to understand how Logic works is to think of it as a sentence, for example: ``"Items List 1 Never goes With Items List 2"`` or ``"Items List 1 Only goes With Items List 2"``.
+
+Both `Item Lists` can contain multiple Variants or Attributes; however, this is limited depending on the type of Rule that is selected (see [Rule Types](#rule-types)). All items in the `Item Lists` are separated by a comma (`,`), with NO spaces.
+
+If you have a large NFT collection, it's recommended to create a Logic.json file so that your Rules are stored safely. See [Logic JSON Schema](#logic-json-schema).

+## Rule Types
+
+There are three rule types:
+
+- ``Never With`` --> If selected, ``Items List 1`` will never appear if ``Items List 2`` is selected. For each NFT DNA that is generated, either ``Items List 1`` or ``Items List 2`` is randomly selected. The selected ``Items List`` is then acted upon depending on whether the items in the list are Attributes or Variants:
+  - If the ``Items List`` contains complete Attribute(s), those Attribute(s) will be set to Empty automatically.
+  - If the ``Items List`` contains Variant(s), the other Variants in that Variant's Attribute will be selected randomly or by weight, depending on whether ``Enable Rarity`` was selected when you created the NFT data.
+
+- ``Only With`` --> If selected, ``Items List 1`` will only appear if ``Items List 2`` is selected. If ``Items List 1`` contains complete Attribute(s), those Attribute(s) will be set to Empty automatically, meaning they will not appear if you export images, animations, or 3D models.
+
+- ``Always With`` --> If selected, ``Items List 1`` will always appear if ``Items List 2`` is selected. ``Items List 1`` CANNOT contain complete Attribute(s) and is limited to single Variants. The list can contain multiple Variants; however, they must be from separate Attributes.
+
+**Important:** The more rules you add, the higher the chance a rule conflict may arise, and you may see Attribute and Variant behaviour that you do not desire.
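The ``Never With`` behaviour described above can be modelled in a few lines of plain Python. This is an illustrative toy, not the add-on's actual implementation; the function names and the DNA-as-dictionary shape are assumptions for the sketch:

```python
import random

def violates_never_with(dna, items_1, items_2):
    """True if Variants from both lists appear in the same DNA."""
    chosen = set(dna.values())
    return bool(chosen & set(items_1)) and bool(chosen & set(items_2))

def enforce_never_with(dna, items_1, items_2, attribute_variants):
    """Randomly pick one side of the rule and reroll its Variants so the
    two lists never co-occur, as the rule description above specifies."""
    if not violates_never_with(dna, items_1, items_2):
        return dna
    banned = set(random.choice([items_1, items_2]))
    for attribute, variant in dna.items():
        if variant in banned:
            # Replace with one of the Attribute's other Variants.
            options = [v for v in attribute_variants[attribute] if v not in banned]
            dna[attribute] = random.choice(options)
    return dna

attribute_variants = {
    "Cube": ["Red Cube_1_25", "Blue Cube_2_25"],
    "Sphere": ["Red Sphere_1_25", "Blue Sphere_2_25"],
}
dna = {"Cube": "Red Cube_1_25", "Sphere": "Red Sphere_1_25"}
fixed = enforce_never_with(dna, ["Red Cube_1_25"], ["Red Sphere_1_25"], attribute_variants)
```

Weighted rerolls (the ``Enable Rarity`` case) would replace `random.choice` with a rarity-weighted draw.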
 ## Logic JSON Schema

+Logic can also be determined by a .json file that you manually create. For the purposes of this documentation, just think of JSON as a text file (.txt) that we can use to store information. You can name this file anything, but for this tutorial let's call it `Logic.json`.

 If you need help creating a JSON file, check out this tutorial: [How to Create JSON File?](https://codebeautify.org/blog/how-to-create-json-file/)

 To learn more about JSON files and how to structure data, read this article: [Working with JSON](https://developer.mozilla.org/en-US/docs/Learn/JavaScript/Objects/JSON)

 If you'd like, copy and paste this template into the JSON file you created above:
@@ -493,20 +492,6 @@ If you'd like, copy and paste this template into the JSON file you created above

 - ``Rule-Type`` The rule that governs the relation between ``Items-1`` and ``Items-2``. Has two possible values: ``Never with`` and ``Only with``.
 - ``Items-2`` A list of strings representing the names of Attribute(s) or Variant(s).

-### Rule Types
-There are two rule types:
-- ``Never With`` --> If selected, ``Items-1`` will never appear if ``Items-2`` are selected. For each NFT DNA that is generated, either ``Items-1`` or ``Items-2`` are randomly selected. That selected ``Items List`` is then acted upon depending on if the items in the list are Attributes or Variants:
-  - If ``Items List`` contains complete Attribute(s), those Attribute(s) will be set to Empty automatically.
-  - If ``Items List`` contains Variant(s), the other Variants in that Variants Attribute will be randomly or weightedly selected depending on if you have ``Enable Rarity`` selected when you create NFT data.
-
-- ``Only With`` --> If selected, ``Items-1`` will only appear if ``Items-2`` are selected. If ``Items-1`` contains complete Attribute(s), those Attribute(s) will be set to Empty automatically. Meaning they will not appear if you export images, animations, or 3D models. Items in ``Items-2`` can only be a single Variant, no attributes, and no list of variants.
-
-- ``Always With`` --> If selected, ``Items-1`` will always appear if ``Items-2`` are selected.``Items-1`` CANNOT contain complete Attribute(s) and is limited to single Variants. The list can contain multiple Variants, however they must be from seperate Attributes.
-
-The best way to understand how Logic works is to think of it as a sentence, example: ``"Items-1 Never goes with Items-2"`` or ``"Items-1 Only goes with Items-2"``.
-
-**Important:** The more rules you add the higher the chance a rule conflict may arise, and you may see Attribute and Variant behaviour that you do not desire.

 ## Example Logic.json File

 Say we have the following scene in a .blend file:

 <img width="420" alt="Screen Shot 2022-03-13 at 4 21 52 PM" src="https://user-images.githubusercontent.com/82110564/158077693-86f961cf-c121-4d0e-8a84-1d6a39e7cafc.png">

@@ -538,7 +523,7 @@ Note that we have two Attributes, ``Cube`` and ``Sphere``, and that they have 4

         "Items-1": [
             "Cube"
         ],
-        "Rule-Type": "Never with",
+        "Rule-Type": "Never With",
         "Items-2": [
             "Red Sphere_1_25"
         ]
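A Logic.json file following the schema above can be sanity-checked before a long generation run. A small illustrative validator (an assumption for demonstration, not shipped with the add-on; it checks the keys and the capitalized Rule-Type values used in the new README):

```python
VALID_RULE_TYPES = {"Never With", "Only With", "Always With"}

def validate_logic(logic: dict):
    """Return a list of problems found in a Logic.json-shaped dict."""
    problems = []
    for rule_name, rule in logic.items():
        for key in ("Items-1", "Rule-Type", "Items-2"):
            if key not in rule:
                problems.append(f"{rule_name}: missing {key!r}")
        if rule.get("Rule-Type") not in VALID_RULE_TYPES:
            problems.append(f"{rule_name}: unknown Rule-Type {rule.get('Rule-Type')!r}")
    return problems

logic = {
    "Rule_1": {
        "Items-1": ["Cube"],
        "Rule-Type": "Never With",
        "Items-2": ["Red Sphere_1_25"],
    }
}
print(validate_logic(logic))
```

An empty list means the file matches the schema; anything else names the offending rule.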
@@ -0,0 +1,121 @@
import bpy

from bpy.props import (IntProperty,
                       BoolProperty,
                       StringProperty,
                       EnumProperty,
                       CollectionProperty)

from bpy.types import (Operator,
                       Panel,
                       PropertyGroup,
                       UIList)


# ======== Operators ======== #
class CUSTOM_OT_custom_metadata_fields_actions(Operator):
    """Move items up and down, add and remove"""
    bl_idname = "custom_metadata_fields_uilist.list_action"
    bl_label = "List Actions"
    bl_description = "Move items up and down, add and remove"
    bl_options = {'REGISTER'}

    action: bpy.props.EnumProperty(
        items=(
            ('UP', "Up", ""),
            ('DOWN', "Down", ""),
            ('REMOVE', "Remove", ""),
            ('ADD', "Add", "")))

    def invoke(self, context, event):
        scn = context.scene
        idx = scn.custom_metadata_fields_index

        try:
            item = scn.custom_metadata_fields[idx]
        except IndexError:
            pass
        else:
            if self.action == 'DOWN' and idx < len(scn.custom_metadata_fields) - 1:
                item_next = scn.custom_metadata_fields[idx + 1].name
                scn.custom_metadata_fields.move(idx, idx + 1)
                scn.custom_metadata_fields_index += 1
                info = 'Item "%s" moved to position %d' % (item.name, scn.custom_metadata_fields_index + 1)
                self.report({'INFO'}, info)

            elif self.action == 'UP' and idx >= 1:
                item_prev = scn.custom_metadata_fields[idx - 1].name
                scn.custom_metadata_fields.move(idx, idx - 1)
                scn.custom_metadata_fields_index -= 1
                info = 'Item "%s" moved to position %d' % (item.name, scn.custom_metadata_fields_index + 1)
                self.report({'INFO'}, info)

            elif self.action == 'REMOVE':
                info = 'Item "%s" removed from list' % (scn.custom_metadata_fields[idx].name)
                scn.custom_metadata_fields_index -= 1
                scn.custom_metadata_fields.remove(idx)
                self.report({'INFO'}, info)

        if self.action == 'ADD':
            if context.object:
                item = scn.custom_metadata_fields.add()
                item.name = "Custom Metadata Field"  # The name of each object
                scn.custom_metadata_fields_index = len(scn.custom_metadata_fields) - 1
                info = '"%s" added to list' % (item.name)
                self.report({'INFO'}, info)
            else:
                self.report({'INFO'}, "Nothing selected in the Viewport")
        return {"FINISHED"}


class CUSTOM_OT_custom_metadata_fields_clearList(Operator):
    """Clear all items of the list"""
    bl_idname = "custom_metadata_fields_uilist.clear_list"
    bl_label = "Clear Custom Fields"
    bl_description = "Clear all items of the list"
    bl_options = {'INTERNAL'}

    @classmethod
    def poll(cls, context):
        return bool(context.scene.custom_metadata_fields)

    def invoke(self, context, event):
        return context.window_manager.invoke_confirm(self, event)

    def execute(self, context):
        if bool(context.scene.custom_metadata_fields):
            context.scene.custom_metadata_fields.clear()
            self.report({'INFO'}, "All items removed")
        else:
            self.report({'INFO'}, "Nothing to remove")
        return {'FINISHED'}


# ======== UILists ======== #
class CUSTOM_UL_custom_metadata_fields_items(UIList):
    def draw_item(self, context, layout, data, item, icon, active_data, active_propname, index):
        split = layout.split(factor=0.1)
        split.label(text=f"{index + 1}")
        row = split.row()
        row.label(text=item.name)  # avoids renaming the item by accident
        row.prop(item, "field_name", text="")
        row.prop(item, "field_value", text="")

    def invoke(self, context, event):
        pass


# ======== Property Collection ======== #
class CUSTOM_custom_metadata_fields_objectCollection(PropertyGroup):
    # name: StringProperty() -> Instantiated by default
    obj_type: StringProperty()
    obj_id: IntProperty()
    field_name: StringProperty(default="Name")
    field_value: StringProperty(default="Value")


# ======== Register/Unregister Classes (Passed to __init__.py for operation) ======== #
classes_Custom_Metadata_UIList = (
    CUSTOM_OT_custom_metadata_fields_actions,
    CUSTOM_OT_custom_metadata_fields_clearList,
    CUSTOM_UL_custom_metadata_fields_items,
    CUSTOM_custom_metadata_fields_objectCollection,
)
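The UP/DOWN/REMOVE branches in the operator above are ordinary list bookkeeping on the scene's collection plus its active index. The index arithmetic can be modelled with a plain Python list, no `bpy` required (the helper name is illustrative):

```python
def move_item(items, idx, direction):
    """Mirror the operator's UP/DOWN branches: swap idx with its
    neighbour and return the new active index."""
    target = idx + 1 if direction == 'DOWN' else idx - 1
    if 0 <= target < len(items):
        items[idx], items[target] = items[target], items[idx]
        return target
    return idx  # at an end of the list: nothing moves

fields = ["Name", "Artist", "Season"]
active = move_item(fields, 0, 'DOWN')  # "Name" swaps with "Artist"
```

Keeping the active index pointed at the moved item is what lets the user press Down repeatedly to walk one entry to the bottom of the list.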
@@ -0,0 +1,135 @@
import bpy

from bpy.props import (IntProperty,
                       BoolProperty,
                       StringProperty,
                       EnumProperty,
                       CollectionProperty)

from bpy.types import (Operator,
                       Panel,
                       PropertyGroup,
                       UIList)


# ======== Operators ======== #
class CUSTOM_OT_logic_actions(Operator):
    """Move items up and down, add and remove"""
    bl_idname = "logic_uilist.logic_list_action"
    bl_label = "List Actions"
    bl_description = "Move items up and down, add and remove"
    bl_options = {'REGISTER'}

    action: bpy.props.EnumProperty(
        items=(
            ('UP', "Up", ""),
            ('DOWN', "Down", ""),
            ('REMOVE', "Remove", ""),
            ('ADD', "Add", "")))

    def invoke(self, context, event):
        scn = context.scene
        idx = scn.logic_fields_index

        try:
            item = scn.logic_fields[idx]
        except IndexError:
            pass
        else:
            if self.action == 'DOWN' and idx < len(scn.logic_fields) - 1:
                item_next = scn.logic_fields[idx + 1].name
                scn.logic_fields.move(idx, idx + 1)
                scn.logic_fields_index += 1
                info = 'Item "%s" moved to position %d' % (item.name, scn.logic_fields_index + 1)
                self.report({'INFO'}, info)

            elif self.action == 'UP' and idx >= 1:
                item_prev = scn.logic_fields[idx - 1].name
                scn.logic_fields.move(idx, idx - 1)
                scn.logic_fields_index -= 1
                info = 'Item "%s" moved to position %d' % (item.name, scn.logic_fields_index + 1)
                self.report({'INFO'}, info)

            elif self.action == 'REMOVE':
                info = 'Item "%s" removed from list' % (scn.logic_fields[idx].name)
                scn.logic_fields_index -= 1
                scn.logic_fields.remove(idx)
                self.report({'INFO'}, info)

        if self.action == 'ADD':
            if context.object:
                item = scn.logic_fields.add()
                item.name = "Rule"  # The name of each object
                scn.logic_fields_index = len(scn.logic_fields) - 1
                info = '"%s" added to list' % (item.name)
                self.report({'INFO'}, info)
            else:
                self.report({'INFO'}, "Nothing selected in the Viewport")
        return {"FINISHED"}


class CUSTOM_OT_logic_clearList(Operator):
    """Clear all items of the list"""
    bl_idname = "logic_uilist.logic_clear_list"
    bl_label = "Clear Logic Rules"
    bl_description = "Clear all items of the list"
    bl_options = {'INTERNAL'}

    @classmethod
    def poll(cls, context):
        return bool(context.scene.logic_fields)

    def invoke(self, context, event):
        return context.window_manager.invoke_confirm(self, event)

    def execute(self, context):
        if bool(context.scene.logic_fields):
            context.scene.logic_fields.clear()
            self.report({'INFO'}, "All items removed")
        else:
            self.report({'INFO'}, "Nothing to remove")
        return {'FINISHED'}


# ======== UILists ======== #
class CUSTOM_UL_logic_items(UIList):
    def draw_item(self, context, layout, data, item, icon, active_data, active_propname, index):
        split = layout.split(factor=0.1)
        split.label(text=f"{index + 1}")
        row = split.row()
        row.label(text=item.name)  # avoids renaming the item by accident
        row.prop(item, "item_list1", text="")
        row.prop(item, "rule_type", text="")
        row.prop(item, "item_list2", text="")

    def invoke(self, context, event):
        pass


# ======== Property Collection ======== #
class CUSTOM_logic_objectCollection(PropertyGroup):
    # name: StringProperty() -> Instantiated by default
    obj_type: StringProperty()
    obj_id: IntProperty()

    item_list1: StringProperty(default="Item List 1")
    rule_type: EnumProperty(
        name="Rule Type",
        description="Select the Rule Type",
        items=[
            ('Never With', "Never With", ""),
            ('Only With', "Only With", ""),
            ('Always With', "Always With", ""),
        ]
    )
    item_list2: StringProperty(default="Item List 2")


# ======== Register/Unregister Classes (Passed to __init__.py for operation) ======== #
classes_Logic_UIList = (
    CUSTOM_OT_logic_actions,
    CUSTOM_OT_logic_clearList,
    CUSTOM_UL_logic_items,
    CUSTOM_logic_objectCollection,
)
__init__.py

@@ -1,18 +1,23 @@
 bl_info = {
-    "name": "Blend_My_NFTs",
+    "name": "Blend_My_NFTs V4 Alpha",
     "author": "Torrin Leonard, This Cozy Studio Inc",
-    "version": (3, 2, 0),
-    "blender": (3, 1, 3),
+    "version": (4, 0, 2),
+    "blender": (3, 2, 0),
     "location": "View3D",
     "description": "An open source, free to use Blender add-on that enables you to create thousands of unique images, animations, and 3D models.",
     "category": "Development",
 }

+BMNFTS_VERSION = "v4.0.2 - Alpha"
+LAST_UPDATED = "8:05AM, May 31st, 2022"

 # ======== Import handling ======== #

 import bpy
 from bpy.app.handlers import persistent
 from bpy.props import (IntProperty,
                        BoolProperty,
                        CollectionProperty)

 import os
 import sys
@@ -22,24 +27,43 @@ import importlib

 # "a little hacky bs" - Matthew TheBrochacho ;)
 sys.path.append(os.path.dirname(os.path.realpath(__file__)))

-if bpy in locals():
-    importlib.reload(DNA_Generator)
-    importlib.reload(Batch_Sorter)
-    importlib.reload(Exporter)
-    importlib.reload(Refactorer)
-    importlib.reload(get_combinations)
-    importlib.reload(Checks)
-    importlib.reload(HeadlessUtil)
-else:
-    from main import \
-        DNA_Generator, \
-        Batch_Sorter, \
-        Exporter, \
-        Refactorer, \
-        get_combinations, \
-        Checks, \
-        HeadlessUtil
+from main import \
+    Checks, \
+    DNA_Generator, \
+    Exporter, \
+    get_combinations, \
+    HeadlessUtil, \
+    loading_animation, \
+    Logic, \
+    Material_Generator, \
+    Metadata, \
+    Rarity, \
+    Refactorer
+
+from UILists import \
+    Custom_Metadata_UIList, \
+    Logic_UIList
+
+if "bpy" in locals():
+    modules = {
+        "Checks": Checks,
+        "DNA_Generator": DNA_Generator,
+        "Exporter": Exporter,
+        "get_combinations": get_combinations,
+        "HeadlessUtil": HeadlessUtil,
+        "loading_animation": loading_animation,
+        "Logic": Logic,
+        "Material_Generator": Material_Generator,
+        "Metadata": Metadata,
+        "Rarity": Rarity,
+        "Refactorer": Refactorer,
+        "Custom_Metadata_UIList": Custom_Metadata_UIList,
+        "Logic_UIList": Logic_UIList,
+    }
+
+    for i in modules:
+        if i in locals():
+            importlib.reload(modules[i])

 # ======== Persistant UI Refresh ======== #
@@ -47,6 +71,7 @@ else:
 combinations: int = 0
 recommended_limit: int = 0


 @persistent
 def Refresh_UI(dummy1, dummy2):
     """

@@ -57,7 +82,7 @@ def Refresh_UI(dummy1, dummy2):
     global recommended_limit

     combinations = (get_combinations.get_combinations())
-    recommended_limit = int(round(combinations/2))
+    recommended_limit = int(round(combinations / 2))

     # Add panel classes that require refresh to this refresh_panels tuple:
     refresh_panel_classes = (
@@ -74,6 +99,7 @@ def Refresh_UI(dummy1, dummy2):

     redraw_panel(refresh_panel_classes)


 bpy.app.handlers.depsgraph_update_post.append(Refresh_UI)
@@ -93,10 +119,12 @@ def make_directories(save_path):
     os.makedirs(nftBatch_save_path)
     return Blend_My_NFTs_Output, batch_json_save_path, nftBatch_save_path


 def runAsHeadless():
     """
     For use when running from the command line.
     """

     def dumpSettings(settings):
         output = (
             f"nftName={settings.nftName}\n"
@@ -120,6 +148,8 @@ def runAsHeadless():
             f"solana_description={settings.solana_description}\n"
             f"enableCustomFields={str(settings.enableCustomFields)}\n"
             f"customfieldsFile={settings.customfieldsFile}\n"
+            f"enableMaterials={str(settings.enableMaterials)}\n"
+            f"materialsFile={settings.materialsFile}\n"
         )
         print(output)
@@ -157,6 +187,8 @@ def runAsHeadless():
     settings.solanaDescription = pairs[18][1]
     settings.enableCustomFields = pairs[19][1] == 'True'
     settings.customfieldsFile = pairs[20][1]
+    settings.enableMaterials = pairs[21][1] == 'True'
+    settings.materialsFile = pairs[22][1]

     if args.save_path:
         settings.save_path = args.save_path
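The headless flow round-trips settings through `name=value` lines like those printed by `dumpSettings`, then indexes them positionally (`pairs[19][1]`, and so on). The parsing side can be sketched as follows; the function name and sample lines are illustrative, but the split-and-compare pattern matches the code above:

```python
def parse_settings(text: str):
    """Parse 'name=value' lines into (name, value) pairs, mirroring how
    runAsHeadless splits its settings dump."""
    return [line.split("=", 1) for line in text.strip().splitlines()]

dump = (
    "enableCustomFields=True\n"
    "customfieldsFile=/tmp/fields.json\n"
    "enableMaterials=False\n"
)
pairs = parse_settings(dump)
# Booleans arrive as strings, hence the string comparison in the code above:
enable_custom_fields = pairs[0][1] == 'True'
```

Because the lookup is positional, adding a new setting (as this commit does with `enableMaterials` and `materialsFile`) means the dump order and the `pairs[N]` indices must be updated together.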
@@ -169,7 +201,7 @@ def runAsHeadless():
     # don't mind me, just copy-pasting code around...
     if args.operation == 'create-dna':
         nftName = settings.nftName
-        maxNFTs = settings.collectionSize
+        collectionSize = settings.collectionSize
         nftsPerBatch = settings.nftsPerBatch
         save_path = bpy.path.abspath(settings.save_path)
         logicFile = bpy.path.abspath(settings.logicFile)
@@ -177,17 +209,19 @@ def runAsHeadless():
         enableRarity = settings.enableRarity
         enableLogic = settings.enableLogic
+        enableMaterials = settings.enableMaterials
+        materialsFile = settings.materialsFile

         Blend_My_NFTs_Output, batch_json_save_path, nftBatch_save_path = make_directories(save_path)

-        DNA_Generator.send_To_Record_JSON(maxNFTs, nftsPerBatch, save_path, enableRarity, enableLogic, logicFile,
-                                          Blend_My_NFTs_Output)
-        Batch_Sorter.makeBatches(nftName, maxNFTs, nftsPerBatch, save_path, batch_json_save_path)
+        DNA_Generator.send_To_Record_JSON(collectionSize, nftsPerBatch, save_path, enableRarity, enableLogic, logicFile, enableMaterials,
+                                          materialsFile, Blend_My_NFTs_Output, batch_json_save_path)

     elif args.operation == 'generate-nfts':
         nftName = settings.nftName
         save_path = bpy.path.abspath(settings.save_path)
         batchToGenerate = settings.batchToGenerate
-        maxNFTs = settings.collectionSize
+        collectionSize = settings.collectionSize

         Blend_My_NFTs_Output, batch_json_save_path, nftBatch_save_path = make_directories(save_path)
@@ -203,16 +237,20 @@ def runAsHeadless():
         enableModelsBlender = settings.modelBool
         modelFileFormat = settings.modelEnum

+        enableMaterials = settings.enableMaterials
+        materialsFile = settings.materialsFile

         # fail state variables, set to no fail due to resume_failed_batch() Operator in BMNFTS_PT_GenerateNFTs Panel
         fail_state = False
         failed_batch = None
         failed_dna = None
         failed_dna_index = None

-        Exporter.render_and_save_NFTs(nftName, maxNFTs, batchToGenerate, batch_json_save_path, nftBatch_save_path,
-                                      enableImages,
+        Exporter.render_and_save_NFTs(nftName, collectionSize, batchToGenerate, batch_json_save_path,
+                                      nftBatch_save_path, enableImages,
                                       imageFileFormat, enableAnimations, animationFileFormat, enableModelsBlender,
-                                      modelFileFormat, fail_state, failed_batch, failed_dna, failed_dna_index
+                                      modelFileFormat, fail_state, failed_batch, failed_dna, failed_dna_index,
+                                      enableMaterials, materialsFile
                                       )
     elif args.operation == 'refactor-batches':
         class refactorData:
@@ -236,7 +274,6 @@ def runAsHeadless():

 # ======== User input Property Group ======== #
 class BMNFTS_PGT_Input_Properties(bpy.types.PropertyGroup):
-
     # Create NFT Data Panel:

     nftName: bpy.props.StringProperty(name="NFT Name")
@@ -245,29 +282,39 @@ class BMNFTS_PGT_Input_Properties(bpy.types.PropertyGroup):
     nftsPerBatch: bpy.props.IntProperty(name="NFTs Per Batch", default=1, min=1) # max=(combinations - offset)

     save_path: bpy.props.StringProperty(
-        name="Save Path",
-        description="Save path for NFT files",
-        default="",
-        maxlen=1024,
-        subtype="DIR_PATH"
+        name="Save Path",
+        description="Save path for NFT files",
+        default="",
+        maxlen=1024,
+        subtype="DIR_PATH"
     )

     enableRarity: bpy.props.BoolProperty(name="Enable Rarity")

     enableLogic: bpy.props.BoolProperty(name="Enable Logic")
+    enable_Logic_Json: bpy.props.BoolProperty(name="Use Logic.json instead")
     logicFile: bpy.props.StringProperty(
-        name="Logic File",
-        description="Path where Logic.json is located.",
-        default="",
-        maxlen=1024,
-        subtype="FILE_PATH"
+        name="Logic File Path",
+        description="Path where Logic.json is located.",
+        default="",
+        maxlen=1024,
+        subtype="FILE_PATH"
     )

+    enableMaterials: bpy.props.BoolProperty(name="Enable Materials")
+    materialsFile: bpy.props.StringProperty(
+        name="Materials File",
+        description="Path where Materials.json is located.",
+        default="",
+        maxlen=1024,
+        subtype="FILE_PATH"
+    )
+
     # Generate NFTs Panel:
     imageBool: bpy.props.BoolProperty(name="Image")
     imageEnum: bpy.props.EnumProperty(
-        name="Image File Format",
-        description="Select Image file format",
+        name="Image File Format",
+        description="Select Image file format",
         items=[
             ('PNG', ".PNG", "Export NFT as PNG"),
             ('JPEG', ".JPEG", "Export NFT as JPEG")
@@ -276,33 +323,38 @@ class BMNFTS_PGT_Input_Properties(bpy.types.PropertyGroup):
     animationBool: bpy.props.BoolProperty(name="Animation")
     animationEnum: bpy.props.EnumProperty(
-        name="Animation File Format",
-        description="Select Animation file format",
+        name="Animation File Format",
+        description="Select Animation file format",
         items=[
             ('AVI_JPEG', '.avi (AVI_JPEG)', 'Export NFT as AVI_JPEG'),
             ('AVI_RAW', '.avi (AVI_RAW)', 'Export NFT as AVI_RAW'),
             ('FFMPEG', '.mkv (FFMPEG)', 'Export NFT as FFMPEG'),
-            ('MP4', '.mp4', 'Export NFT as .mp4')
+            ('MP4', '.mp4', 'Export NFT as .mp4'),
+            ('PNG', '.png', 'Export NFT as PNG'),
+            ('TIFF', '.tiff', 'Export NFT as TIFF')
         ]
     )

     modelBool: bpy.props.BoolProperty(name="3D Model")
     modelEnum: bpy.props.EnumProperty(
-        name="3D Model File Format",
-        description="Select 3D Model file format",
+        name="3D Model File Format",
+        description="Select 3D Model file format",
         items=[
             ('GLB', '.glb', 'Export NFT as .glb'),
-            ('GLTF_SEPARATE', '.gltf + .bin + textures', 'Export NFT as .gltf with separated textures in .bin + textures.'),
+            ('GLTF_SEPARATE', '.gltf + .bin + textures',
+             'Export NFT as .gltf with separated textures in .bin + textures.'),
             ('GLTF_EMBEDDED', '.gltf', 'Export NFT as embedded .gltf file that contains textures.'),
             ('FBX', '.fbx', 'Export NFT as .fbx'),
             ('OBJ', '.obj', 'Export NFT as .obj'),
             ('X3D', '.x3d', 'Export NFT as .x3d'),
             ('STL', '.stl', 'Export NFT as .stl'),
-            ('VOX', '.vox (Experimental)', 'Export NFT as .vox, requires the voxwriter add on: https://github.com/Spyduck/voxwriter')
+            ('VOX', '.vox (Experimental)',
+             'Export NFT as .vox, requires the voxwriter add on: https://github.com/Spyduck/voxwriter')
         ]
     )

-    batchToGenerate: bpy.props.IntProperty(name="Batch To Generate", default=1, min=1) # max=(collectionSize / nftsPerBatch)
+    batchToGenerate: bpy.props.IntProperty(name="Batch To Generate", default=1,
+                                           min=1)

     # Refactor Batches & Create Metadata Panel:
     cardanoMetaDataBool: bpy.props.BoolProperty(name="Cardano Cip")
@@ -316,11 +368,11 @@ class BMNFTS_PGT_Input_Properties(bpy.types.PropertyGroup):
     enableCustomFields: bpy.props.BoolProperty(name="Enable Custom Metadata Fields")
     customfieldsFile: bpy.props.StringProperty(
-        name="Custom Fields File",
-        description="Path where Custom_Fields.json is located.",
-        default="",
-        maxlen=1024,
-        subtype="FILE_PATH"
+        name="Custom Fields File",
+        description="Path where Custom_Fields.json is located.",
+        default="",
+        maxlen=1024,
+        subtype="FILE_PATH"
     )

     # Other Panel:
@@ -333,67 +385,160 @@ class BMNFTS_PGT_Input_Properties(bpy.types.PropertyGroup):
 class createData(bpy.types.Operator):
     bl_idname = 'create.data'
     bl_label = 'Create Data'
-    bl_description = 'Creates NFT Data. Run after any changes were made to scene.'
+    bl_description = 'Creates NFT Data. Run after any changes were made to scene. All previous data will be overwritten and cannot be recovered.'
     bl_options = {"REGISTER", "UNDO"}

-    def execute(self, context):
+    reverse_order: BoolProperty(
+        default=False,
+        name="Reverse Order")
+
+    def execute(self, context):
         nftName = bpy.context.scene.input_tool.nftName
         collectionSize = bpy.context.scene.input_tool.collectionSize
         nftsPerBatch = bpy.context.scene.input_tool.nftsPerBatch
         save_path = bpy.path.abspath(bpy.context.scene.input_tool.save_path)
-        logicFile = bpy.path.abspath(bpy.context.scene.input_tool.logicFile)

         enableRarity = bpy.context.scene.input_tool.enableRarity

         enableLogic = bpy.context.scene.input_tool.enableLogic
+        enable_Logic_Json = bpy.context.scene.input_tool.enable_Logic_Json
+        logicFile = bpy.path.abspath(bpy.context.scene.input_tool.logicFile)

-        Blend_My_NFTs_Output, batch_json_save_path, nftBatch_save_path = make_directories(save_path)
+        enableMaterials = bpy.context.scene.input_tool.enableMaterials
+        materialsFile = bpy.path.abspath(bpy.context.scene.input_tool.materialsFile)

-        DNA_Generator.send_To_Record_JSON(collectionSize, nftsPerBatch, save_path, enableRarity, enableLogic, logicFile, Blend_My_NFTs_Output)
-        Batch_Sorter.makeBatches(nftName, collectionSize, nftsPerBatch, save_path, batch_json_save_path)
+        # Handling Custom Fields UIList input:
+        if enableLogic:
+            if enable_Logic_Json and logicFile:
+                logicFile = json.load(open(logicFile))
+
+                Blend_My_NFTs_Output, batch_json_save_path, nftBatch_save_path = make_directories(save_path)
+                DNA_Generator.send_To_Record_JSON(collectionSize, nftsPerBatch, save_path, enableRarity, enableLogic, logicFile, enableMaterials,
+                                                  materialsFile, Blend_My_NFTs_Output, batch_json_save_path)
+
+            if enable_Logic_Json and not logicFile:
+                self.report({'ERROR'}, f"No Logic.json file path set. Please set the file path to your Logic.json file.")
+
+            if not enable_Logic_Json:
+                scn = context.scene
+                if self.reverse_order:
+                    logicFile = {}
+                    num = 1
+                    for i in range(scn.logic_fields_index, -1, -1):
+                        item = scn.logic_fields[i]
+
+                        item_list1 = item.item_list1
+                        rule_type = item.rule_type
+                        item_list2 = item.item_list2
+                        logicFile[f"Rule-{num}"] = {
+                            "Items-1": item_list1.split(','),
+                            "Rule-Type": rule_type,
+                            "Items-2": item_list2.split(',')
+                        }
+                        num += 1
+                    Blend_My_NFTs_Output, batch_json_save_path, nftBatch_save_path = make_directories(save_path)
+                    DNA_Generator.send_To_Record_JSON(collectionSize, nftsPerBatch, save_path, enableRarity, enableLogic, logicFile, enableMaterials,
+                                                      materialsFile, Blend_My_NFTs_Output, batch_json_save_path)
+                else:
+                    logicFile = {}
+                    num = 1
+                    for item in scn.logic_fields:
+                        item_list1 = item.item_list1
+                        rule_type = item.rule_type
+                        item_list2 = item.item_list2
+                        logicFile[f"Rule-{num}"] = {
+                            "Items-1": item_list1.split(','),
+                            "Rule-Type": rule_type,
+                            "Items-2": item_list2.split(',')
+                        }
+                        num += 1
+                    Blend_My_NFTs_Output, batch_json_save_path, nftBatch_save_path = make_directories(save_path)
+                    DNA_Generator.send_To_Record_JSON(collectionSize, nftsPerBatch, save_path, enableRarity, enableLogic, logicFile, enableMaterials,
+                                                      materialsFile, Blend_My_NFTs_Output, batch_json_save_path)
+
+        if not enableLogic:
+            Blend_My_NFTs_Output, batch_json_save_path, nftBatch_save_path = make_directories(save_path)
+            DNA_Generator.send_To_Record_JSON(collectionSize, nftsPerBatch, save_path, enableRarity, enableLogic, logicFile, enableMaterials,
+                                              materialsFile, Blend_My_NFTs_Output, batch_json_save_path)
         self.report({'INFO'}, f"NFT Data created!")

         return {"FINISHED"}

+    def invoke(self, context, event):
+        return context.window_manager.invoke_confirm(self, event)
+

 class exportNFTs(bpy.types.Operator):
     bl_idname = 'exporter.nfts'
     bl_label = 'Export NFTs'
     bl_description = 'Generate and export a given batch of NFTs.'
     bl_options = {"REGISTER", "UNDO"}

+    reverse_order: BoolProperty(
+        default=False,
+        name="Reverse Order")
+
     def execute(self, context):
-        nftName = bpy.context.scene.input_tool.nftName
-        save_path = bpy.path.abspath(bpy.context.scene.input_tool.save_path)
-        batchToGenerate = bpy.context.scene.input_tool.batchToGenerate
-        collectionSize = bpy.context.scene.input_tool.collectionSize
+        class input:
+            nftName = bpy.context.scene.input_tool.nftName
+            save_path = bpy.path.abspath(bpy.context.scene.input_tool.save_path)
+            batchToGenerate = bpy.context.scene.input_tool.batchToGenerate
+            collectionSize = bpy.context.scene.input_tool.collectionSize

-        Blend_My_NFTs_Output, batch_json_save_path, nftBatch_save_path = make_directories(save_path)
+            Blend_My_NFTs_Output, batch_json_save_path, nftBatch_save_path = make_directories(save_path)

-        enableImages = bpy.context.scene.input_tool.imageBool
-        imageFileFormat = bpy.context.scene.input_tool.imageEnum
+            enableImages = bpy.context.scene.input_tool.imageBool
+            imageFileFormat = bpy.context.scene.input_tool.imageEnum

-        enableAnimations = bpy.context.scene.input_tool.animationBool
-        animationFileFormat = bpy.context.scene.input_tool.animationEnum
+            enableAnimations = bpy.context.scene.input_tool.animationBool
+            animationFileFormat = bpy.context.scene.input_tool.animationEnum

-        enableModelsBlender = bpy.context.scene.input_tool.modelBool
-        modelFileFormat = bpy.context.scene.input_tool.modelEnum
+            enableModelsBlender = bpy.context.scene.input_tool.modelBool
+            modelFileFormat = bpy.context.scene.input_tool.modelEnum

-        # fail state variables, set to no fail due to resume_failed_batch() Operator in BMNFTS_PT_GenerateNFTs Panel
-        fail_state = False
-        failed_batch = None
-        failed_dna = None
-        failed_dna_index = None
+            enableCustomFields = bpy.context.scene.input_tool.enableCustomFields
+            custom_Fields = {}

-        Exporter.render_and_save_NFTs(nftName, collectionSize, batchToGenerate, batch_json_save_path, nftBatch_save_path, enableImages,
-                                      imageFileFormat, enableAnimations, animationFileFormat, enableModelsBlender,
-                                      modelFileFormat, fail_state, failed_batch, failed_dna, failed_dna_index
-                                      )
+            cardanoMetaDataBool = bpy.context.scene.input_tool.cardanoMetaDataBool
+            solanaMetaDataBool = bpy.context.scene.input_tool.solanaMetaDataBool
+            erc721MetaData = bpy.context.scene.input_tool.erc721MetaData

-        self.report({'INFO'}, f"All NFTs generated for batch {batchToGenerate}!")
+            cardano_description = bpy.context.scene.input_tool.cardano_description
+            solana_description = bpy.context.scene.input_tool.solana_description
+            erc721_description = bpy.context.scene.input_tool.erc721_description
+
+            enableMaterials = bpy.context.scene.input_tool.enableMaterials
+            materialsFile = bpy.path.abspath(bpy.context.scene.input_tool.materialsFile)
+
+            # fail state variables, set to no fail due to resume_failed_batch() Operator in BMNFTS_PT_GenerateNFTs Panel
+            fail_state = False
+            failed_batch = None
+            failed_dna = None
+            failed_dna_index = None
+
+        # Handling Custom Fields UIList input:
+        if input.enableCustomFields:
+            scn = context.scene
+            if self.reverse_order:
+                for i in range(scn.custom_metadata_fields_index, -1, -1):
+                    item = scn.custom_metadata_fields[i]
+                    if item.field_name in list(input.custom_Fields.keys()):
+                        raise ValueError(f"A duplicate of '{item.field_name}' was found. Please ensure all Custom Metadata field Names are unique.")
+                    else:
+                        input.custom_Fields[item.field_name] = item.field_value
+            else:
+                for item in scn.custom_metadata_fields:
+                    if item.field_name in list(input.custom_Fields.keys()):
+                        raise ValueError(f"A duplicate of '{item.field_name}' was found. Please ensure all Custom Metadata field Names are unique.")
+                    else:
+                        input.custom_Fields[item.field_name] = item.field_value
+
+        Exporter.render_and_save_NFTs(input)
+
+        self.report({'INFO'}, f"All NFTs generated for batch {input.batchToGenerate}!")

         return {"FINISHED"}


 class resume_failed_batch(bpy.types.Operator):
     bl_idname = 'exporter.resume_nfts'
     bl_label = 'Resume Failed Batch'
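The Logic_UIList rows above are flattened into the same rule dictionary shape that a hand-written Logic.json would contain, with comma-separated item lists split into Python lists. A small illustrative sketch (not the add-on's code; the rule type string here is a made-up placeholder):

```python
# Hypothetical sketch: turn (items1, rule_type, items2) UIList rows into
# the "Rule-N" dictionary structure built in createData.execute above.
import json

def rules_from_fields(fields):
    logic = {}
    for num, (item_list1, rule_type, item_list2) in enumerate(fields, start=1):
        logic[f"Rule-{num}"] = {
            "Items-1": item_list1.split(','),   # comma-separated names -> list
            "Rule-Type": rule_type,
            "Items-2": item_list2.split(','),
        }
    return logic

rules = rules_from_fields([("Hat_1,Hat_2", "Never with", "Glasses_1")])
print(json.dumps(rules, indent=1))
```

This mirrors why `enable_Logic_Json` exists: whether the rules come from a file or from the UIList, `send_To_Record_JSON` receives the same dictionary shape.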
@@ -401,34 +546,50 @@ class resume_failed_batch(bpy.types.Operator):
     bl_options = {"REGISTER", "UNDO"}

     def execute(self, context):
-        nftName = bpy.context.scene.input_tool.nftName
-        save_path = bpy.path.abspath(bpy.context.scene.input_tool.save_path)
-        batchToGenerate = bpy.context.scene.input_tool.batchToGenerate
-        collectionSize = bpy.context.scene.input_tool.collectionSize
+        class input:
+            save_path = bpy.path.abspath(bpy.context.scene.input_tool.save_path)
+            batchToGenerate = bpy.context.scene.input_tool.batchToGenerate

-        Blend_My_NFTs_Output, batch_json_save_path, nftBatch_save_path = make_directories(save_path)
-        fail_state, failed_batch, failed_dna, failed_dna_index = Checks.check_FailedBatches(batch_json_save_path)
+            Blend_My_NFTs_Output, batch_json_save_path, nftBatch_save_path = make_directories(save_path)

-        file_name = os.path.join(batch_json_save_path, "Batch{}.json".format(batchToGenerate))
-        batch = json.load(open(file_name))
+            file_name = os.path.join(batch_json_save_path, "Batch{}.json".format(batchToGenerate))
+            batch = json.load(open(file_name))

-        nftBatch_save_path = batch["Generation Save"][-1]["Render_Settings"]["nftBatch_save_path"]
-        enableImages = batch["Generation Save"][-1]["Render_Settings"]["enableImages"]
-        imageFileFormat = batch["Generation Save"][-1]["Render_Settings"]["imageFileFormat"]
-        enableAnimations = batch["Generation Save"][-1]["Render_Settings"]["enableAnimations"]
-        animationFileFormat = batch["Generation Save"][-1]["Render_Settings"]["animationFileFormat"]
-        enableModelsBlender = batch["Generation Save"][-1]["Render_Settings"]["enableModelsBlender"]
-        modelFileFormat = batch["Generation Save"][-1]["Render_Settings"]["modelFileFormat"]
+            nftName = batch["Generation Save"][-1]["Render_Settings"]["nftName"]
+            collectionSize = batch["Generation Save"][-1]["Render_Settings"]["collectionSize"]
+            nftBatch_save_path = batch["Generation Save"][-1]["Render_Settings"]["nftBatch_save_path"]

-        Exporter.render_and_save_NFTs(nftName, collectionSize, failed_batch, batch_json_save_path, nftBatch_save_path, enableImages,
-                                      imageFileFormat, enableAnimations, animationFileFormat, enableModelsBlender,
-                                      modelFileFormat, fail_state, failed_batch, failed_dna, failed_dna_index
-                                      )
+            enableImages = batch["Generation Save"][-1]["Render_Settings"]["enableImages"]
+            imageFileFormat = batch["Generation Save"][-1]["Render_Settings"]["imageFileFormat"]
+
+            enableAnimations = batch["Generation Save"][-1]["Render_Settings"]["enableAnimations"]
+            animationFileFormat = batch["Generation Save"][-1]["Render_Settings"]["animationFileFormat"]
+
+            enableModelsBlender = batch["Generation Save"][-1]["Render_Settings"]["enableModelsBlender"]
+            modelFileFormat = batch["Generation Save"][-1]["Render_Settings"]["modelFileFormat"]
+
+            enableCustomFields = batch["Generation Save"][-1]["Render_Settings"]["enableCustomFields"]
+            custom_Fields = batch["Generation Save"][-1]["Render_Settings"]["custom_Fields"]
+
+            cardanoMetaDataBool = batch["Generation Save"][-1]["Render_Settings"]["cardanoMetaDataBool"]
+            solanaMetaDataBool = batch["Generation Save"][-1]["Render_Settings"]["solanaMetaDataBool"]
+            erc721MetaData = batch["Generation Save"][-1]["Render_Settings"]["erc721MetaData"]
+
+            cardano_description = batch["Generation Save"][-1]["Render_Settings"]["cardano_description"]
+            solana_description = batch["Generation Save"][-1]["Render_Settings"]["solana_description"]
+            erc721_description = batch["Generation Save"][-1]["Render_Settings"]["erc721_description"]
+
+            enableMaterials = batch["Generation Save"][-1]["Render_Settings"]["enableMaterials"]
+            materialsFile = batch["Generation Save"][-1]["Render_Settings"]["materialsFile"]
+
+            fail_state, failed_batch, failed_dna, failed_dna_index = Checks.check_FailedBatches(batch_json_save_path)
+
+        Exporter.render_and_save_NFTs(input)

         self.report({'INFO'}, f"Resuming Failed Batch Generation!")

         return {"FINISHED"}


 class refactor_Batches(bpy.types.Operator):
     """Refactor your collection? This action cannot be undone."""
     bl_idname = 'refactor.batches'
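`resume_failed_batch` recovers all of its render settings from the last entry in a batch file's `"Generation Save"` list rather than from the UI, so a resumed run uses exactly the settings that were active when the batch failed. A minimal sketch of that readback (the field values below are placeholders, not real batch data):

```python
# Sketch: the last "Generation Save" entry holds the Render_Settings that
# were active when the batch failed; [-1] selects the most recent save.
import json

batch_json = json.dumps({
    "Generation Save": [
        {"Render_Settings": {"nftName": "CozyNFT", "collectionSize": 100}},
        {"Render_Settings": {"nftName": "CozyNFT", "collectionSize": 120}},
    ]
})

batch = json.loads(batch_json)
render_settings = batch["Generation Save"][-1]["Render_Settings"]
```

Reading from the batch file instead of the scene means the resume is reproducible even if the user changed panel settings after the failure.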
@@ -436,17 +597,17 @@ class refactor_Batches(bpy.types.Operator):
     bl_description = 'This action cannot be undone.'
     bl_options = {'REGISTER', 'INTERNAL'}

-    @classmethod
-    def poll(cls, context):
-        return True
+    reverse_order: BoolProperty(
+        default=False,
+        name="Reverse Order")

     def execute(self, context):
-        class refactor_panel_input:
+        class input:
             save_path = bpy.path.abspath(bpy.context.scene.input_tool.save_path)

-            custom_Fields_File = bpy.path.abspath(bpy.context.scene.input_tool.customfieldsFile)
+            enableCustomFields = bpy.context.scene.input_tool.enableCustomFields
+            custom_Fields = {}

             cardanoMetaDataBool = bpy.context.scene.input_tool.cardanoMetaDataBool
             solanaMetaDataBool = bpy.context.scene.input_tool.solanaMetaDataBool
@@ -458,14 +619,14 @@ class refactor_Batches(bpy.types.Operator):

         Blend_My_NFTs_Output, batch_json_save_path, nftBatch_save_path = make_directories(save_path)

-        Refactorer.reformatNFTCollection(refactor_panel_input)
-        self.report({'INFO'}, "Batches Refactored, MetaData created!")
+        # Passing info to main functions for refactoring:
+        Refactorer.reformatNFTCollection(input)
         return {"FINISHED"}

     def invoke(self, context, event):
         return context.window_manager.invoke_confirm(self, event)


 class export_settings(bpy.types.Operator):
     """Export your settings into a configuration file."""
     bl_idname = 'export.settings'
@@ -524,6 +685,10 @@ class export_settings(bpy.types.Operator):
             "#Enable Custom Fields\n"
             f"enableCustomFields={str(settings.enableCustomFields)}\n"
             f"customfieldsFile={settings.customfieldsFile}\n"
+            "\n"
+            "#Enable Materials\n"
+            f"enableMaterials={str(settings.enableMaterials)}\n"
+            f"materialsFile={settings.materialsFile}\n"
         )

         print(output, file=config)
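The exported configuration is plain `key=value` lines grouped under `#` comment headers, which is the same format the headless `runAsHeadless()` path parses back in. A small sketch of building the new materials section (the helper name is illustrative):

```python
# Hypothetical helper mirroring the "#Enable Materials" section added to
# the export_settings output above: one comment header plus key=value lines.

def materials_section(enableMaterials, materialsFile):
    return (
        "#Enable Materials\n"
        f"enableMaterials={str(enableMaterials)}\n"   # bool -> 'True'/'False'
        f"materialsFile={materialsFile}\n"
    )

section = materials_section(True, "/tmp/Materials.json")
```

Because booleans are serialized with `str()`, round-tripping depends on the reader comparing against the exact string `'True'`, as the headless parser does.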
@@ -568,15 +733,52 @@ class BMNFTS_PT_CreateData(bpy.types.Panel):
         row = layout.row()
         row.prop(input_tool_scene, "enableLogic")

+        # Logic_UIList implementation:
         if bpy.context.scene.input_tool.enableLogic:
+            layout = self.layout
+            scn = bpy.context.scene
+
+            rows = 2
             row = layout.row()
-            row.prop(input_tool_scene, "logicFile")
+            row.template_list("CUSTOM_UL_logic_items", "", scn, "logic_fields", scn,
+                              "logic_fields_index", rows=rows)
+
+            col = row.column(align=True)
+            col.operator("logic_uilist.logic_list_action", icon='ZOOM_IN', text="").action = 'ADD'
+            col.operator("logic_uilist.logic_list_action", icon='ZOOM_OUT', text="").action = 'REMOVE'
+            col.separator()
+            col.operator("logic_uilist.logic_list_action", icon='TRIA_UP', text="").action = 'UP'
+            col.operator("logic_uilist.logic_list_action", icon='TRIA_DOWN', text="").action = 'DOWN'
+
+            row = layout.row()
+            col = row.column(align=True)
+            row = col.row(align=True)
+            row.operator("logic_uilist.logic_clear_list", icon="X")
+            row = col.row(align=True)
+            row.label(text=f"*Field Names must be unique.")
+
+            row = layout.row()
+            row.prop(input_tool_scene, "enable_Logic_Json")
+
+            if bpy.context.scene.input_tool.enable_Logic_Json:
+                row = layout.row()
+                row.prop(input_tool_scene, "logicFile")
+
+        row = layout.row()
+        row.prop(input_tool_scene, "enableMaterials")
+
+        if bpy.context.scene.input_tool.enableMaterials:
+            row = layout.row()
+            row.prop(input_tool_scene, "materialsFile")

         row = layout.row()
         self.layout.operator("create.data", icon='DISCLOSURE_TRI_RIGHT', text="Create Data")
         row = layout.row()
         layout.label(text=f"{BMNFTS_VERSION}")


 class BMNFTS_PT_GenerateNFTs(bpy.types.Panel):
-    bl_label = "Generate NFTs"
+    bl_label = "Generate NFTs & Create Metadata"
     bl_idname = "BMNFTS_PT_GenerateNFTs"
     bl_space_type = 'VIEW_3D'
     bl_region_type = 'UI'
@@ -605,40 +807,6 @@ class BMNFTS_PT_GenerateNFTs(bpy.types.Panel):
         if bpy.context.scene.input_tool.modelBool:
             row.prop(input_tool_scene, "modelEnum")

-        row = layout.row()
-        row.prop(input_tool_scene, "batchToGenerate")
-
-        save_path = bpy.path.abspath(bpy.context.scene.input_tool.save_path)
-        Blend_My_NFTs_Output = os.path.join(save_path, "Blend_My_NFTs Output", "NFT_Data")
-        batch_json_save_path = os.path.join(Blend_My_NFTs_Output, "Batch_Data")
-        nftBatch_save_path = os.path.join(save_path, "Blend_My_NFTs Output", "Generated NFT Batches")
-
-        fail_state, failed_batch, failed_dna, failed_dna_index = Checks.check_FailedBatches(batch_json_save_path)
-
-        if fail_state:
-            row = layout.row()
-            self.layout.operator("exporter.nfts", icon='RENDER_RESULT', text="Generate NFTs")
-
-            row = layout.row()
-            row.alert = True
-            row.operator("exporter.resume_nfts", icon='ERROR', text="Resume Failed Batch")
-
-        if not fail_state:
-            row = layout.row()
-            self.layout.operator("exporter.nfts", icon='RENDER_RESULT', text="Generate NFTs")
-
-class BMNFTS_PT_Refactor(bpy.types.Panel):
-    bl_label = "Refactor Batches & Create Metadata"
-    bl_idname = "BMNFTS_PT_Refactor"
-    bl_space_type = 'VIEW_3D'
-    bl_region_type = 'UI'
-    bl_category = 'Blend_My_NFTs'
-
-    def draw(self, context):
-        layout = self.layout
-        scene = context.scene
-        input_tool_scene = scene.input_tool
-
         row = layout.row()
         layout.label(text="Meta Data format:")

@@ -674,12 +842,73 @@ class BMNFTS_PT_Refactor(bpy.types.Panel):

         row = layout.row()
         row.prop(input_tool_scene, "enableCustomFields")

+        # Custom Metadata Fields UIList:
         if bpy.context.scene.input_tool.enableCustomFields:
+            layout = self.layout
+            scn = bpy.context.scene
+
+            rows = 2
             row = layout.row()
-            row.prop(input_tool_scene, "customfieldsFile")
+            row.template_list("CUSTOM_UL_custom_metadata_fields_items", "", scn, "custom_metadata_fields", scn, "custom_metadata_fields_index", rows=rows)
+
+            col = row.column(align=True)
+            col.operator("custom_metadata_fields_uilist.list_action", icon='ZOOM_IN', text="").action = 'ADD'
+            col.operator("custom_metadata_fields_uilist.list_action", icon='ZOOM_OUT', text="").action = 'REMOVE'
+            col.separator()
+            col.operator("custom_metadata_fields_uilist.list_action", icon='TRIA_UP', text="").action = 'UP'
+            col.operator("custom_metadata_fields_uilist.list_action", icon='TRIA_DOWN', text="").action = 'DOWN'
+
+            row = layout.row()
+            col = row.column(align=True)
+            row = col.row(align=True)
+            row.label(text=f"*Field Names must be unique.")
+            row = col.row(align=True)
+            row.operator("custom_metadata_fields_uilist.clear_list", icon="X")

         row = layout.row()
-        self.layout.operator("refactor.batches", icon='FOLDER_REDIRECT', text="Refactor Batches & Create Metadata")
+        row.prop(input_tool_scene, "batchToGenerate")
+
+        save_path = bpy.path.abspath(bpy.context.scene.input_tool.save_path)
+        Blend_My_NFTs_Output = os.path.join(save_path, "Blend_My_NFTs Output", "NFT_Data")
+        batch_json_save_path = os.path.join(Blend_My_NFTs_Output, "Batch_Data")
+        nftBatch_save_path = os.path.join(save_path, "Blend_My_NFTs Output", "Generated NFT Batches")
+
+        fail_state, failed_batch, failed_dna, failed_dna_index = Checks.check_FailedBatches(batch_json_save_path)
+
+        if fail_state:
+            row = layout.row()
+            self.layout.operator("exporter.nfts", icon='RENDER_RESULT', text="Generate NFTs & Create Metadata")
+
+            row = layout.row()
+            row.alert = True
+            row.operator("exporter.resume_nfts", icon='ERROR', text="Resume Failed Batch")
+
+        if not fail_state:
+            row = layout.row()
+            self.layout.operator("exporter.nfts", icon='RENDER_RESULT', text="Generate NFTs & Create Metadata")
+
+
+class BMNFTS_PT_Refactor(bpy.types.Panel):
+    bl_label = "Refactor Batches"
+    bl_idname = "BMNFTS_PT_Refactor"
+    bl_space_type = 'VIEW_3D'
+    bl_region_type = 'UI'
+    bl_category = 'Blend_My_NFTs'
+
+    def draw(self, context):
+        layout = self.layout
+        scene = context.scene
+        input_tool_scene = scene.input_tool
+
+        row = layout.row()
+        layout.label(text="Ensure all batches have been created before refactoring.")
+        layout.label(text="Refactoring combines all batches into one easy to manage folder.")
+
+        row = layout.row()
+        self.layout.operator("refactor.batches", icon='FOLDER_REDIRECT', text="Refactor Batches")


 class BMNFTS_PT_Other(bpy.types.Panel):
     bl_label = "Other"
@@ -709,6 +938,11 @@ class BMNFTS_PT_Other(bpy.types.Panel):
         row = layout.row()
         layout.label(text=f"**Set a Save Path in Create NFT Data to Export Settings")

+        row = layout.row()
+
+        row = layout.row()
+        layout.label(text=f"Looking for help?")
+
         row = layout.row()
         row.operator("wm.url_open", text="Blend_My_NFTs Documentation",
                      icon='URL').url = "https://github.com/torrinworx/Blend_My_NFTs"
@@ -716,6 +950,12 @@ class BMNFTS_PT_Other(bpy.types.Panel):
         row = layout.row()
         row.operator("wm.url_open", text="YouTube Tutorials",
                      icon='URL').url = "https://www.youtube.com/watch?v=ygKJYz4BjRs&list=PLuVvzaanutXcYtWmPVKu2bx83EYNxLRsX"
+        row = layout.row()
+        row.operator("wm.url_open", text="Join Our Discord Community!",
+                     icon='URL').url = "https://discord.gg/UpZt5Un57t"

         row = layout.row()
         layout.label(text=f"{BMNFTS_VERSION}, {LAST_UPDATED}")


 # ======== Blender add-on register/unregister handling ======== #
@@ -735,7 +975,8 @@ classes = (
     BMNFTS_PT_GenerateNFTs,
     BMNFTS_PT_Refactor,
     BMNFTS_PT_Other,
-)
+) + Custom_Metadata_UIList.classes_Custom_Metadata_UIList + Logic_UIList.classes_Logic_UIList


 def register():
     for cls in classes:
@@ -743,12 +984,24 @@ def register():

     bpy.types.Scene.input_tool = bpy.props.PointerProperty(type=BMNFTS_PGT_Input_Properties)

+    bpy.types.Scene.custom_metadata_fields = CollectionProperty(type=Custom_Metadata_UIList.CUSTOM_custom_metadata_fields_objectCollection)
+    bpy.types.Scene.custom_metadata_fields_index = IntProperty()
+
+    bpy.types.Scene.logic_fields = CollectionProperty(type=Logic_UIList.CUSTOM_logic_objectCollection)
+    bpy.types.Scene.logic_fields_index = IntProperty()
+

 def unregister():
-    for cls in classes:
+    for cls in reversed(classes):
         bpy.utils.unregister_class(cls)

     del bpy.types.Scene.input_tool

+    del bpy.types.Scene.custom_metadata_fields
+    del bpy.types.Scene.custom_metadata_fields_index
+
+    del bpy.types.Scene.logic_fields
+    del bpy.types.Scene.logic_fields_index
+

 if __name__ == '__main__':
     register()
@@ -1,70 +0,0 @@
-# Purpose:
-# This file sorts the NFT DNA from NFTRecord.json and exports it to a given number of Batch#.json files set by nftsPerBatch
-# in config.py.
-
-import bpy
-import os
-import json
-import random
-
-
-def makeBatches(nftName, maxNFTs, nftsPerBatch, save_path, batch_json_save_path):
-    Blend_My_NFTs_Output = os.path.join(save_path, "Blend_My_NFTs Output", "NFT_Data")
-    NFTRecord_save_path = os.path.join(Blend_My_NFTs_Output, "NFTRecord.json")
-
-    DataDictionary = json.load(open(NFTRecord_save_path))
-
-    numNFTsGenerated = DataDictionary["numNFTsGenerated"]
-    hierarchy = DataDictionary["hierarchy"]
-    DNAList = DataDictionary["DNAList"]
-
-    numBatches = maxNFTs / nftsPerBatch
-
-    print(f"To generate batches of {nftsPerBatch} DNA sequences per batch, with a total of {numNFTsGenerated}"
-          f" possible NFT DNA sequences, the number of batches generated will be {numBatches}")
-
-    # Clears the Batch Data folder of Batches:
-    batchList = os.listdir(batch_json_save_path)
-
-    if batchList:
-        for i in batchList:
-            batch = os.path.join(batch_json_save_path, i)
-            if os.path.exists(batch):
-                os.remove(
-                    os.path.join(batch_json_save_path, i)
-                )
-
-    i = 0
-    while i < numBatches:
-        batchDictionary = {}
-        BatchDNAList = []
-
-        j = 0
-        while (j < nftsPerBatch) and (DNAList):
-            oneDNA = random.choice(DNAList)
-            BatchDNAList.append({
-                oneDNA: {"Complete": False}
-            })
-            DNAList.remove(oneDNA)
-            j += 1
-
-        batchDictionary["NFTs_in_Batch"] = int(len(BatchDNAList))
-        batchDictionary["hierarchy"] = hierarchy
-        batchDictionary["BatchDNAList"] = BatchDNAList
-
-        batchDictionaryObject = json.dumps(batchDictionary, indent=1, ensure_ascii=True)
-
-        with open(os.path.join(batch_json_save_path, ("Batch{}.json".format(i + 1))), "w") as outfile:
-            outfile.write(batchDictionaryObject)
-
-        i += 1
-
-    if len(DNAList) > 0:  # Add to Checks.py
-        print(f"One batch could not be filled completely and will contain {len(DNAList)} NFTs.")
-
-        incompleteBatch = {"NFTs_in_Batch": int(len(DNAList)), "hierarchy": hierarchy, "BatchDNAList": DNAList}
-
-        incompleteBatch = json.dumps(incompleteBatch, indent=1, ensure_ascii=True)
-
-        with open(os.path.join(batch_json_save_path, ("Batch{}.json".format(i + 1))), "w") as outfile2:
-            outfile2.write(incompleteBatch)
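The file deleted above partitioned the DNA list by repeatedly drawing random entries until each batch filled. Its core loop can be run standalone with toy DNA strings (a sketch without the JSON output or Blender dependencies; the function name is mine):

```python
import random

def split_into_batches(dna_list, nfts_per_batch):
    """Randomly partition dna_list into batches of at most nfts_per_batch,
    mirroring the deleted makeBatches loop (minus the Batch#.json output)."""
    dna_list = list(dna_list)  # work on a copy; the original loop mutated DNAList
    batches = []
    while dna_list:
        batch = []
        while len(batch) < nfts_per_batch and dna_list:
            one_dna = random.choice(dna_list)
            batch.append({one_dna: {"Complete": False}})
            dna_list.remove(one_dna)
        batches.append(batch)
    return batches

batches = split_into_batches(["1-1", "1-2", "2-1", "2-2", "3-1"], 2)
print([len(b) for b in batches])  # [2, 2, 1]
```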
@@ -27,28 +27,31 @@ def check_Scene():  # Not complete
     variant_naming_conventions = None  # True if all variants in Blender scene follow BMNFTs naming conventions
     object_placing_conventions = None  # True if all objects are within either Script_Ignore or a variant collection

-    hierarchy = DNA_Generator.get_hierarchy()
-
     # script_ignore_exists:
     try:
         scriptIgnoreCollection = bpy.data.collections["Script_Ignore"]
-        script_ignore_exists = True
     except KeyError:
         raise TypeError(
             f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n"
-            f"collection to your Blender scene and ensure the name is exactly 'Script_Ignore'. For more information, "
+            f"Add a Script_Ignore collection to your Blender scene and ensure the name is exactly 'Script_Ignore'. For more information, "
             f"see:"
             f"\nhttps://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n{bcolors.RESET}"
         )
+    else:
+        script_ignore_exists = True
+
+    hierarchy = DNA_Generator.get_hierarchy()
+    collections = bpy.context.scene.collection
+    print(collections)

     # attribute_naming_conventions

-def check_Rarity(hierarchy, DNAList, save_path):
+def check_Rarity(hierarchy, DNAListFormatted, save_path):
     """Checks rarity percentage of each Variant, then sends it to RarityData.json in NFT_Data folder."""
+
+    DNAList = []
+    for i in DNAListFormatted:
+        DNAList.append(list(i.keys())[0])

     numNFTsGenerated = len(DNAList)

     numDict = defaultdict(list)

@@ -109,8 +112,12 @@ def check_Rarity(hierarchy, DNAList, save_path):
     path = os.path.join(save_path, "RarityData.json")
     print(bcolors.OK + f"Rarity Data has been saved to {path}." + bcolors.RESET)

-def check_Duplicates(DNAList):
+def check_Duplicates(DNAListFormatted):
     """Checks if there are duplicates in DNAList before NFTRecord.json is sent to JSON file."""
+    DNAList = []
+    for i in DNAListFormatted:
+        DNAList.append(list(i.keys())[0])

     duplicates = 0
     seen = set()

@@ -121,7 +128,7 @@ def check_Duplicates(DNAList):
             duplicates += 1
         seen.add(x)

-    print(f"NFTRecord.json contains {duplicates} duplicate NFT DNA.")
+    print(f"\nNFTRecord.json contains {duplicates} duplicate NFT DNA.")

 def check_FailedBatches(batch_json_save_path):
     fail_state = False

@@ -137,7 +144,7 @@ def check_FailedBatches(batch_json_save_path):
         NFTs_in_Batch = batch["NFTs_in_Batch"]
         if "Generation Save" in batch:
             dna_generated = batch["Generation Save"][-1]["DNA Generated"]
-            if dna_generated < NFTs_in_Batch:
+            if dna_generated is not None and dna_generated < NFTs_in_Batch:
                 fail_state = True
                 failed_batch = int(i.removeprefix("Batch").removesuffix(".json"))
                 failed_dna = dna_generated
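Both checks above now unwrap the formatted DNA list (a list of single-key dicts) before scanning. The unwrap-plus-count step of `check_Duplicates`, as a runnable sketch (function name is mine):

```python
def count_duplicates(dna_list_formatted):
    """Unwrap the formatted DNA entries, then count repeats with a seen-set,
    the same way check_Duplicates does."""
    dna_list = [list(entry.keys())[0] for entry in dna_list_formatted]

    duplicates = 0
    seen = set()
    for x in dna_list:
        if x in seen:
            duplicates += 1
        seen.add(x)
    return duplicates

formatted = [{"1-1": {"Complete": False}},
             {"1-2": {"Complete": False}},
             {"1-1": {"Complete": False}}]
print(count_duplicates(formatted))  # 1
```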
@@ -7,18 +7,21 @@ import os
 removeList = [".gitignore", ".DS_Store", "desktop.ini", ".ini"]

 def remove_file_by_extension(dirlist):
     """
     Checks if a given directory list contains any of the files or file extensions listed above; if so, removes them from
     the list and returns a clean dir list. These files interfere with BMNFTs operations and should be removed whenever
     dealing with directories.
     """
-    return_dirs = []
-    for directory in dirlist:
-        if not str(os.path.splitext(directory)[1]) in removeList:
-            return_dirs.append(directory)
-
-    return return_dirs
+    if str(type(dirlist)) == "<class 'list'>":
+        dirlist = list(dirlist)  # converts single string path to list if dir pasted as string
+
+    return_dirs = []
+    for directory in dirlist:
+        if not str(os.path.split(directory)[1]) in removeList:
+            return_dirs.append(directory)
+
+    return return_dirs

 class bcolors:
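The updated helper compares the full file name (`os.path.split(...)[1]`) rather than just the extension (`os.path.splitext(...)[1]`), so exact names like `.DS_Store` now match `removeList`. The new filter in isolation:

```python
import os

removeList = [".gitignore", ".DS_Store", "desktop.ini", ".ini"]

def remove_file_by_extension(dirlist):
    """Drop OS/junk files from a directory listing, as in the updated Constants.py."""
    if str(type(dirlist)) == "<class 'list'>":
        dirlist = list(dirlist)  # single string path pasted as list stays a list

    return_dirs = []
    for directory in dirlist:
        # os.path.split(...)[1] is the file name itself, so exact junk names match
        if not str(os.path.split(directory)[1]) in removeList:
            return_dirs.append(directory)
    return return_dirs

print(remove_file_by_extension(["Batch1.json", ".DS_Store", ".gitignore"]))  # ['Batch1.json']
```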
@@ -10,7 +10,7 @@ import json
 import random
 from functools import partial
 from .loading_animation import Loader
-from . import Rarity, Logic, Checks
+from . import Rarity, Logic, Checks, Material_Generator
 from .Constants import bcolors, removeList, remove_file_by_extension
@@ -81,12 +81,22 @@ def get_hierarchy():
     """
     allAttDataList = {}
     for i in attributeVariants:
+        # Check if name follows naming conventions:
+        if i.count("_") > 2:
+            raise Exception(
+                f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n"
+                f"There is a naming issue with the following Attribute/Variant: '{i}'\n"
+                f"Review the naming convention of Attribute and Variant collections here:\n{bcolors.RESET}"
+                f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n"
+            )
+
         def getName(i):
             """
             Returns the name of "i" attribute variant
             """
             name = i.split("_")[0]
             return name

         def getOrder_rarity(i):

@@ -101,8 +111,25 @@ def get_hierarchy():
         name = getName(i)
         orderRarity = getOrder_rarity(i)

-        number = orderRarity[0]
-        rarity = orderRarity[1]
+        try:
+            number = orderRarity[0]
+        except:
+            raise Exception(
+                f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n"
+                f"There is a naming issue with the following Attribute/Variant: '{i}'\n"
+                f"Review the naming convention of Attribute and Variant collections here:\n{bcolors.RESET}"
+                f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n"
+            )
+
+        try:
+            rarity = orderRarity[1]
+        except:
+            raise Exception(
+                f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n"
+                f"There is a naming issue with the following Attribute/Variant: '{i}'\n"
+                f"Review the naming convention of Attribute and Variant collections here:\n{bcolors.RESET}"
+                f"https://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n"
+            )

         eachObject = {"name": name, "number": number, "rarity": rarity}
         allAttDataList[i] = eachObject

@@ -126,7 +153,7 @@ def get_hierarchy():
     return hierarchy

-def generateNFT_DNA(collectionSize, logicFile, enableRarity, enableLogic):
+def generateNFT_DNA(collectionSize, enableRarity, enableLogic, logicFile, enableMaterials, materialsFile):
     """
     Returns batchDataDictionary containing the number of NFT combinations, hierarchy, and the DNAList.
     """

@@ -163,6 +190,7 @@ def generateNFT_DNA(collectionSize, logicFile, enableRarity, enableLogic):
     def singleCompleteDNA():
         """This function applies Rarity and Logic to a single DNA created by createDNASingle() if Rarity or Logic specified"""
         singleDNA = ""
+        # Comments for debugging random, rarity, logic, and materials.
         if not enableRarity:
             singleDNA = createDNArandom()
         # print("============")

@@ -172,8 +200,14 @@ def generateNFT_DNA(collectionSize, logicFile, enableRarity, enableLogic):
         if enableLogic:
             singleDNA = Logic.logicafyDNAsingle(hierarchy, singleDNA, logicFile)
-            # print(f"Logic DNA: {singleDNA}")
+            # print(f"Original DNA: {singleDNA}")
             # print("============\n")

+        if enableMaterials:
+            singleDNA = Material_Generator.apply_materials(hierarchy, singleDNA, materialsFile)
+            # print(f"Materials DNA: {singleDNA}")
+            # print("============\n")
+
         return singleDNA

     def create_DNAList():

@@ -185,9 +219,21 @@ def generateNFT_DNA(collectionSize, logicFile, enableRarity, enableLogic):
         DNASetReturn |= {''.join([dnaPushToList()]) for _ in range(collectionSize - len(DNASetReturn))}

-        DNAListReturn = list(DNASetReturn)
-
-        return DNAListReturn
+        DNAListUnformatted = list(DNASetReturn)
+
+        DNAListFormatted = []
+        DNA_Counter = 1
+        for i in DNAListUnformatted:
+            DNAListFormatted.append({
+                i: {
+                    "Complete": False,
+                    "Order_Num": DNA_Counter
+                }
+            })
+
+            DNA_Counter += 1
+
+        return DNAListFormatted

     DNAList = create_DNAList()
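`create_DNAList` now wraps each raw DNA string in a dict carrying a `Complete` flag and an `Order_Num` (used later by the exporter for stable NFT naming). The formatting step alone, as a sketch (function name is mine):

```python
def format_dna_list(dna_set):
    """Convert a set of DNA strings into the formatted list stored in NFTRecord.json."""
    dna_list_formatted = []
    dna_counter = 1
    for dna in list(dna_set):
        dna_list_formatted.append({
            dna: {
                "Complete": False,      # flipped to True once the NFT renders
                "Order_Num": dna_counter  # stable 1-based ordering for naming
            }
        })
        dna_counter += 1
    return dna_list_formatted

formatted = format_dna_list({"1-1-2"})
print(formatted)  # [{'1-1-2': {'Complete': False, 'Order_Num': 1}}]
```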
@@ -202,7 +248,63 @@ def generateNFT_DNA(collectionSize, logicFile, enableRarity, enableLogic)
     return DataDictionary

-def send_To_Record_JSON(collectionSize, nftsPerBatch, save_path, enableRarity, enableLogic, logicFile, Blend_My_NFTs_Output):
+def makeBatches(collectionSize, nftsPerBatch, save_path, batch_json_save_path):
+    """
+    Sorts through all the batches and outputs a given number of batches depending on collectionSize and nftsPerBatch.
+    These files are then saved as Batch#.json files to batch_json_save_path.
+    """
+
+    # Clears the Batch Data folder of Batches:
+    batchList = os.listdir(batch_json_save_path)
+    if batchList:
+        for i in batchList:
+            batch = os.path.join(batch_json_save_path, i)
+            if os.path.exists(batch):
+                os.remove(
+                    os.path.join(batch_json_save_path, i)
+                )
+
+    Blend_My_NFTs_Output = os.path.join(save_path, "Blend_My_NFTs Output", "NFT_Data")
+    NFTRecord_save_path = os.path.join(Blend_My_NFTs_Output, "NFTRecord.json")
+    DataDictionary = json.load(open(NFTRecord_save_path))
+
+    numNFTsGenerated = DataDictionary["numNFTsGenerated"]
+    hierarchy = DataDictionary["hierarchy"]
+    DNAList = DataDictionary["DNAList"]
+
+    numBatches = collectionSize // nftsPerBatch
+    remainder_dna = collectionSize % nftsPerBatch
+    if remainder_dna > 0:
+        numBatches += 1
+
+    print(f"To generate batches of {nftsPerBatch} DNA sequences per batch, with a total of {numNFTsGenerated}"
+          f" possible NFT DNA sequences, the number of batches generated will be {numBatches}")
+
+    batches_dna_list = []
+
+    for i in range(numBatches):
+        BatchDNAList = []
+        if i != range(numBatches)[-1]:
+            BatchDNAList = list(DNAList[0:nftsPerBatch])
+            batches_dna_list.append(BatchDNAList)
+
+            DNAList = [x for x in DNAList if x not in BatchDNAList]
+        else:
+            BatchDNAList = DNAList
+
+        batchDictionary = {
+            "NFTs_in_Batch": int(len(BatchDNAList)),
+            "hierarchy": hierarchy,
+            "BatchDNAList": BatchDNAList
+        }
+
+        batchDictionary = json.dumps(batchDictionary, indent=1, ensure_ascii=True)
+
+        with open(os.path.join(batch_json_save_path, f"Batch{i + 1}.json"), "w") as outfile:
+            outfile.write(batchDictionary)
+
+def send_To_Record_JSON(collectionSize, nftsPerBatch, save_path, enableRarity, enableLogic, logicFile, enableMaterials,
+                        materialsFile, Blend_My_NFTs_Output, batch_json_save_path):
     """
     Creates NFTRecord.json file and sends "batchDataDictionary" to it. NFTRecord.json is a permanent record of all DNA
     you've generated with all attribute variants. If you add new variants or attributes to your .blend file, other scripts

@@ -232,7 +334,8 @@ def send_To_Record_JSON(collectionSize, nftsPerBatch, save_path, enableRarity, e
     def create_nft_data():
         try:
-            DataDictionary = generateNFT_DNA(collectionSize, logicFile, enableRarity, enableLogic)
+            DataDictionary = generateNFT_DNA(collectionSize, enableRarity, enableLogic, logicFile, enableMaterials,
+                                             materialsFile)
             NFTRecord_save_path = os.path.join(Blend_My_NFTs_Output, "NFTRecord.json")

             # Checks:

@@ -276,6 +379,7 @@ def send_To_Record_JSON(collectionSize, nftsPerBatch, save_path, enableRarity, e
     # Loading Animation:
     loading = Loader(f'Creating NFT DNA...', '').start()
     create_nft_data()
+    makeBatches(collectionSize, nftsPerBatch, save_path, batch_json_save_path)
     loading.stop()

     time_end = time.time()
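The new `makeBatches` derives the batch count by floor division with a remainder bump (ceil division) and fills batches sequentially instead of sampling at random. A sketch of just the counting and partitioning (plain slicing here stands in for the committed list-comprehension filtering, but yields the same partition):

```python
def batch_sizes(collection_size, nfts_per_batch):
    """Sizes of the Batch#.json files the new makeBatches would emit."""
    num_batches = collection_size // nfts_per_batch
    if collection_size % nfts_per_batch > 0:
        num_batches += 1  # one extra, partially filled batch for the remainder

    dna_list = list(range(collection_size))  # stand-in for real DNA strings
    sizes = []
    for i in range(num_batches):
        batch = dna_list[i * nfts_per_batch:(i + 1) * nfts_per_batch]
        sizes.append(len(batch))
    return sizes

print(batch_sizes(10, 3))  # [3, 3, 3, 1]
```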
280  main/Exporter.py
@@ -9,6 +9,7 @@ import json
 import datetime
 from .loading_animation import Loader
 from .Constants import bcolors, removeList, remove_file_by_extension
+from .Metadata import createCardanoMetadata, createSolanaMetaData, createErc721MetaData

 # Save info
@@ -18,10 +19,10 @@ def save_batch(batch, file_name):
     with open(os.path.join(file_name), 'w') as outfile:
         outfile.write(saved_batch + '\n')

-def save_generation_state(batchToGenerate, batch_json_save_path, nftBatch_save_path, enableImages, imageFileFormat, enableAnimations,
-                          animationFileFormat, enableModelsBlender, modelFileFormat):
+def save_generation_state(input):
     """Saves date and time of generation start, and generation types; Images, Animations, 3D Models, and the file types for each."""
-    file_name = os.path.join(batch_json_save_path, "Batch{}.json".format(batchToGenerate))
+    file_name = os.path.join(input.batch_json_save_path, "Batch{}.json".format(input.batchToGenerate))
     batch = json.load(open(file_name))

     CURRENT_TIME = datetime.datetime.now().strftime("%H:%M:%S")
@@ -39,26 +40,52 @@ def save_generation_state(batchToGenerate, batch_json_save_path, nftBatch_save_p
         "DNA Generated": None,
         "Generation Start Date and Time": [CURRENT_TIME, CURRENT_DATE, LOCAL_TIMEZONE],
         "Render_Settings": {
-            "nftBatch_save_path": nftBatch_save_path,
-            "enableImages": enableImages,
-            "imageFileFormat": imageFileFormat,
-            "enableAnimations": enableAnimations,
-            "animationFileFormat": animationFileFormat,
-            "enableModelsBlender": enableModelsBlender,
-            "modelFileFormat": modelFileFormat,
+            "nftName": input.nftName,
+            "save_path": input.save_path,
+            "batchToGenerate": input.batchToGenerate,
+            "collectionSize": input.collectionSize,
+
+            "Blend_My_NFTs_Output": input.Blend_My_NFTs_Output,
+            "batch_json_save_path": input.batch_json_save_path,
+            "nftBatch_save_path": input.nftBatch_save_path,
+
+            "enableImages": input.enableImages,
+            "imageFileFormat": input.imageFileFormat,
+
+            "enableAnimations": input.enableAnimations,
+            "animationFileFormat": input.animationFileFormat,
+
+            "enableModelsBlender": input.enableModelsBlender,
+            "modelFileFormat": input.modelFileFormat,
+
+            "enableCustomFields": input.enableCustomFields,
+            "custom_Fields": input.custom_Fields,
+
+            "cardanoMetaDataBool": input.cardanoMetaDataBool,
+            "solanaMetaDataBool": input.solanaMetaDataBool,
+            "erc721MetaData": input.erc721MetaData,
+
+            "cardano_description": input.cardano_description,
+            "solana_description": input.solana_description,
+            "erc721_description": input.erc721_description,
+
+            "enableMaterials": input.enableMaterials,
+            "materialsFile": input.materialsFile,
         },
     })

     save_batch(batch, file_name)

-def save_completed(single_dna, a, x, batch_json_save_path, batchToGenerate):
+def save_completed(full_single_dna, a, x, batch_json_save_path, batchToGenerate):
     """Saves progress of rendering to batch.json file."""
     file_name = os.path.join(batch_json_save_path, "Batch{}.json".format(batchToGenerate))
     batch = json.load(open(file_name))

     index = batch["BatchDNAList"].index(a)
-    batch["BatchDNAList"][index][single_dna]["Complete"] = True
+    batch["BatchDNAList"][index][full_single_dna]["Complete"] = True
     batch["Generation Save"][-1]["DNA Generated"] = x

     save_batch(batch, file_name)
@@ -79,36 +106,41 @@ def getBatchData(batchToGenerate, batch_json_save_path):
     return NFTs_in_Batch, hierarchy, BatchDNAList

-def render_and_save_NFTs(nftName, maxNFTs, batchToGenerate, batch_json_save_path, nftBatch_save_path, enableImages,
-                         imageFileFormat, enableAnimations, animationFileFormat, enableModelsBlender,
-                         modelFileFormat, fail_state, failed_batch, failed_dna, failed_dna_index
-                         ):
+def render_and_save_NFTs(input):
     """
     Renders the NFT DNA in a Batch#.json, where # is renderBatch in config.py. Turns off the viewport camera and
     the render camera for all items in hierarchy.
     """
-    NFTs_in_Batch, hierarchy, BatchDNAList = getBatchData(batchToGenerate, batch_json_save_path)
+    print(f"\nFAILED BATCH = {input.failed_batch}\n")
+    print(f"\nBATCH TO GENERATE = {input.batchToGenerate}\n")

     time_start_1 = time.time()

-    if fail_state:
-        for a in range(failed_dna):
+    if input.fail_state:
+        NFTs_in_Batch, hierarchy, BatchDNAList = getBatchData(input.failed_batch, input.batch_json_save_path)
+        for a in range(input.failed_dna):
             del BatchDNAList[0]
-        x = failed_dna + 1
+        x = input.failed_dna + 1

     else:
-        save_generation_state(batchToGenerate, batch_json_save_path, nftBatch_save_path, enableImages, imageFileFormat,
-                              enableAnimations,
-                              animationFileFormat, enableModelsBlender, modelFileFormat)
+        NFTs_in_Batch, hierarchy, BatchDNAList = getBatchData(input.batchToGenerate, input.batch_json_save_path)
+        save_generation_state(input)
         x = 1

+    if input.enableMaterials:
+        materialsFile = json.load(open(input.materialsFile))
+
     for a in BatchDNAList:
-        single_dna = list(a.keys())[0]
-        for i in hierarchy:
-            for j in hierarchy[i]:
-                bpy.data.collections[j].hide_render = True
-                bpy.data.collections[j].hide_viewport = True
+        full_single_dna = list(a.keys())[0]
+        Order_Num = a[full_single_dna]['Order_Num']
+
+        # Material handling:
+        if input.enableMaterials:
+            single_dna, material_dna = full_single_dna.split(':')
+
+        if not input.enableMaterials:
+            single_dna = full_single_dna

         def match_DNA_to_Variant(single_dna):
             """
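When materials are enabled, each batch entry's key is variant DNA and material DNA joined by a colon, which `render_and_save_NFTs` splits back apart. The split in isolation (function name is mine, not from the add-on):

```python
def split_full_dna(full_single_dna, enable_materials):
    """Separate variant DNA from material DNA, mirroring the loop in
    render_and_save_NFTs. Without materials there is no material half."""
    if enable_materials:
        single_dna, material_dna = full_single_dna.split(':')
        return single_dna, material_dna
    return full_single_dna, None

print(split_full_dna("1-2-1:0-3-0", True))   # ('1-2-1', '0-3-0')
print(split_full_dna("1-2-1", False))        # ('1-2-1', None)
```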
@@ -129,10 +161,84 @@ def render_and_save_NFTs(nftName, maxNFTs, batchToGenerate, batch_json_save_path
             dnaDictionary.update({x: k})
             return dnaDictionary

-        dnaDictionary = match_DNA_to_Variant(single_dna)
-        name = nftName + "_" + str(x)
-
-        print(f"\n{bcolors.OK}|---Generating NFT {x}/{NFTs_in_Batch} ---|{bcolors.RESET}")
+        def match_materialDNA_to_Material(single_dna, material_dna, materialsFile):
+            """
+            Matches the Material DNA to its selected Materials unless a 0 is present, meaning no material for that variant was selected.
+            """
+            listAttributes = list(hierarchy.keys())
+            listDnaDecunstructed = single_dna.split('-')
+            listMaterialDNADeconstructed = material_dna.split('-')
+
+            full_dna_dict = {}
+
+            for attribute, variant, material in zip(listAttributes, listDnaDecunstructed, listMaterialDNADeconstructed):
+
+                for var in hierarchy[attribute]:
+                    if hierarchy[attribute][var]['number'] == variant:
+                        variant = var
+
+                if material != '0':
+                    for variant_m in materialsFile:
+                        if variant == variant_m:
+                            for mat in materialsFile[variant_m]["Material List"]:
+                                if mat.split('_')[1] == material:
+                                    material = mat
+
+                full_dna_dict[variant] = material
+
+            return full_dna_dict
+
+        metadataMaterialDict = {}
+
+        if input.enableMaterials:
+            materialdnaDictionary = match_materialDNA_to_Material(single_dna, material_dna, materialsFile)
+
+            for var_mat in list(materialdnaDictionary.keys()):
+                if materialdnaDictionary[var_mat] != '0':
+                    if not materialsFile[var_mat]['Variant Objects']:
+                        """
+                        If objects to apply material to not specified, apply to all objects in Variant collection.
+                        """
+                        metadataMaterialDict[var_mat] = materialdnaDictionary[var_mat]
+
+                        for obj in bpy.data.collections[var_mat].all_objects:
+                            selected_object = bpy.data.objects.get(obj.name)
+                            selected_object.active_material = bpy.data.materials[materialdnaDictionary[var_mat]]
+
+                    if materialsFile[var_mat]['Variant Objects']:
+                        """
+                        If objects to apply material to are specified, apply material only to objects specified within the Variant collection.
+                        """
+                        metadataMaterialDict[var_mat] = materialdnaDictionary[var_mat]
+
+                        for obj in materialsFile[var_mat]['Variant Objects']:
+                            selected_object = bpy.data.objects.get(obj)
+                            selected_object.active_material = bpy.data.materials[materialdnaDictionary[var_mat]]
+
+        # Turn off render camera and viewport camera for all collections in hierarchy
+        for i in hierarchy:
+            for j in hierarchy[i]:
+                try:
+                    bpy.data.collections[j].hide_render = True
+                    bpy.data.collections[j].hide_viewport = True
+                except KeyError:
+                    raise TypeError(
+                        f"\n{bcolors.ERROR}Blend_My_NFTs Error:\n"
+                        f"The Collection '{j}' appears to be missing or has been renamed. If you made any changes to "
+                        f"your .blend file scene, ensure you re-create your NFT Data so Blend_My_NFTs can read your scene. "
+                        f"For more information see:{bcolors.RESET}"
+                        f"\nhttps://github.com/torrinworx/Blend_My_NFTs#blender-file-organization-and-structure\n"
+                    )
+
+        dnaDictionary = match_DNA_to_Variant(single_dna)
+        name = input.nftName + "_" + str(Order_Num)
+
+        # Change Text Object in Scene to match DNA string:
+        # Variables that can be used: full_single_dna, name, Order_Num
+        # ob = bpy.data.objects['Text']  # Object name
+        # ob.data.body = str(f"DNA: {full_single_dna}")  # Set text of Text Object ob
+
+        print(f"\n{bcolors.OK}|--- Generating NFT {x}/{NFTs_in_Batch}: {name} ---|{bcolors.RESET}")
         print(f"DNA attribute list:\n{dnaDictionary}\nDNA Code:{single_dna}")

         for c in dnaDictionary:
@@ -141,22 +247,26 @@ def render_and_save_NFTs(nftName, maxNFTs, batchToGenerate, batch_json_save_path
             bpy.data.collections[collection].hide_render = False
             bpy.data.collections[collection].hide_viewport = False

         time_start_2 = time.time()

-        batchFolder = os.path.join(nftBatch_save_path, "Batch" + str(batchToGenerate))
-
-        imagePath = os.path.join(batchFolder, "Images", name)
-        animationPath = os.path.join(batchFolder, "Animations", name)
-        modelPath = os.path.join(batchFolder, "Models", name)
-        metaDataFolder = os.path.join(batchFolder, "BMNFT_metaData")
+        # Main paths for batch subfolders:
+        batchFolder = os.path.join(input.nftBatch_save_path, "Batch" + str(input.batchToGenerate))
+
+        imageFolder = os.path.join(batchFolder, "Images")
+        animationFolder = os.path.join(batchFolder, "Animations")
+        modelFolder = os.path.join(batchFolder, "Models")
+        BMNFT_metaData_Folder = os.path.join(batchFolder, "BMNFT_metadata")
+
+        imagePath = os.path.join(imageFolder, name)
+        animationPath = os.path.join(animationFolder, name)
+        modelPath = os.path.join(modelFolder, name)
+
+        cardanoMetadataPath = os.path.join(batchFolder, "Cardano_metadata")
+        solanaMetadataPath = os.path.join(batchFolder, "Solana_metadata")
+        erc721MetadataPath = os.path.join(batchFolder, "Erc721_metadata")

         # Generation/Rendering:
-        if enableImages:
+        if input.enableImages:
             print(f"{bcolors.OK}---Image---{bcolors.RESET}")

             image_render_time_start = time.time()
@@ -166,7 +276,7 @@ def render_and_save_NFTs(nftName, maxNFTs, batchToGenerate, batch_json_save_path
             os.makedirs(imageFolder)

             bpy.context.scene.render.filepath = imagePath
-            bpy.context.scene.render.image_settings.file_format = imageFileFormat
+            bpy.context.scene.render.image_settings.file_format = input.imageFileFormat
             bpy.ops.render.render(write_still=True)

             # Loading Animation:

@@ -180,7 +290,7 @@ def render_and_save_NFTs(nftName, maxNFTs, batchToGenerate, batch_json_save_path
                 f"{bcolors.OK}Rendered image in {image_render_time_end - image_render_time_start}s.\n{bcolors.RESET}"
             )

-        if enableAnimations:
+        if input.enableAnimations:
             print(f"{bcolors.OK}---Animation---{bcolors.RESET}")

             animation_render_time_start = time.time()
@@ -189,17 +299,33 @@ def render_and_save_NFTs(nftName, maxNFTs, batchToGenerate, batch_json_save_path
             if not os.path.exists(animationFolder):
                 os.makedirs(animationFolder)

-            bpy.context.scene.render.filepath = animationPath
-
-            if animationFileFormat == 'MP4':
+            if input.animationFileFormat == "MP4":
+                bpy.context.scene.render.filepath = animationPath
                 bpy.context.scene.render.image_settings.file_format = "FFMPEG"

                 bpy.context.scene.render.ffmpeg.format = 'MPEG4'
                 bpy.context.scene.render.ffmpeg.codec = 'H264'
                 bpy.ops.render.render(animation=True)

+            elif input.animationFileFormat == 'PNG':
+                if not os.path.exists(animationPath):
+                    os.makedirs(animationPath)
+
+                bpy.context.scene.render.filepath = os.path.join(animationPath, name)
+                bpy.context.scene.render.image_settings.file_format = input.animationFileFormat
+                bpy.ops.render.render(animation=True)
+
+            elif input.animationFileFormat == 'TIFF':
+                if not os.path.exists(animationPath):
+                    os.makedirs(animationPath)
+
+                bpy.context.scene.render.filepath = os.path.join(animationPath, name)
+                bpy.context.scene.render.image_settings.file_format = input.animationFileFormat
+                bpy.ops.render.render(animation=True)
+
             else:
-                bpy.context.scene.render.image_settings.file_format = animationFileFormat
+                bpy.context.scene.render.filepath = animationPath
+                bpy.context.scene.render.image_settings.file_format = input.animationFileFormat
                 bpy.ops.render.render(animation=True)

             # Loading Animation:
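The animation export above now branches per format: MP4 routes through FFMPEG with an H264 codec, while image-sequence formats (PNG, TIFF) get a per-NFT subfolder for their frames. A sketch of the dispatch reduced to path/settings selection, returning a plain dict in place of `bpy.context.scene.render` (the dict keys are my own labels, not Blender API names):

```python
import os

def animation_render_settings(animation_folder, name, file_format):
    """Mirror the new per-format branching, returning the values
    render_and_save_NFTs would push into the Blender render settings."""
    animation_path = os.path.join(animation_folder, name)
    if file_format == "MP4":
        return {"filepath": animation_path, "file_format": "FFMPEG",
                "ffmpeg_format": "MPEG4", "codec": "H264"}
    elif file_format in ("PNG", "TIFF"):
        # Image sequences get a folder per NFT; frames are named after the NFT.
        return {"filepath": os.path.join(animation_path, name),
                "file_format": file_format}
    else:
        return {"filepath": animation_path, "file_format": file_format}

print(animation_render_settings("Batch1/Animations", "MyNFT_1", "MP4")["file_format"])  # FFMPEG
```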
@@ -213,7 +339,7 @@ def render_and_save_NFTs(nftName, maxNFTs, batchToGenerate, batch_json_save_path
                 f"{bcolors.OK}Rendered animation in {animation_render_time_end - animation_render_time_start}s.\n{bcolors.RESET}"
             )

-        if enableModelsBlender:
+        if input.enableModelsBlender:
             print(f"{bcolors.OK}---3D Model---{bcolors.RESET}")

             model_generation_time_start = time.time()
@@ -231,38 +357,38 @@ def render_and_save_NFTs(nftName, maxNFTs, batchToGenerate, batch_json_save_path
             for obj in bpy.data.collections['Script_Ignore'].all_objects:
                 obj.select_set(True)

-            if modelFileFormat == 'GLB':
+            if input.modelFileFormat == 'GLB':
                 bpy.ops.export_scene.gltf(filepath=f"{modelPath}.glb",
                                           check_existing=True,
                                           export_format='GLB',
                                           use_selection=True)
-            if modelFileFormat == 'GLTF_SEPARATE':
+            if input.modelFileFormat == 'GLTF_SEPARATE':
                 bpy.ops.export_scene.gltf(filepath=f"{modelPath}",
                                           check_existing=True,
                                           export_format='GLTF_SEPARATE',
                                           use_selection=True)
-            if modelFileFormat == 'GLTF_EMBEDDED':
+            if input.modelFileFormat == 'GLTF_EMBEDDED':
                 bpy.ops.export_scene.gltf(filepath=f"{modelPath}.gltf",
                                           check_existing=True,
                                           export_format='GLTF_EMBEDDED',
                                           use_selection=True)
-            elif modelFileFormat == 'FBX':
+            elif input.modelFileFormat == 'FBX':
                 bpy.ops.export_scene.fbx(filepath=f"{modelPath}.fbx",
                                          check_existing=True,
                                          use_selection=True)
-            elif modelFileFormat == 'OBJ':
+            elif input.modelFileFormat == 'OBJ':
                 bpy.ops.export_scene.obj(filepath=f"{modelPath}.obj",
                                          check_existing=True,
                                          use_selection=True, )
-            elif modelFileFormat == 'X3D':
+            elif input.modelFileFormat == 'X3D':
                 bpy.ops.export_scene.x3d(filepath=f"{modelPath}.x3d",
                                          check_existing=True,
                                          use_selection=True)
-            elif modelFileFormat == 'STL':
+            elif input.modelFileFormat == 'STL':
                 bpy.ops.export_mesh.stl(filepath=f"{modelPath}.stl",
                                         check_existing=True,
                                         use_selection=True)
-            elif modelFileFormat == 'VOX':
+            elif input.modelFileFormat == 'VOX':
                 bpy.ops.export_vox.some_data(filepath=f"{modelPath}.vox")

             # Loading Animation:
@ -276,23 +402,43 @@ def render_and_save_NFTs(nftName, maxNFTs, batchToGenerate, batch_json_save_path
            f"{bcolors.OK}Generated model in {model_generation_time_end - model_generation_time_start}s.\n{bcolors.RESET}"
        )

    if not os.path.exists(metaDataFolder):
        os.makedirs(metaDataFolder)
    # Generating Metadata:
    if input.cardanoMetaDataBool:
        if not os.path.exists(cardanoMetadataPath):
            os.makedirs(cardanoMetadataPath)
        createCardanoMetadata(name, Order_Num, full_single_dna, dnaDictionary, metadataMaterialDict, input.custom_Fields,
                              input.enableCustomFields, input.cardano_description, cardanoMetadataPath)

    if input.solanaMetaDataBool:
        if not os.path.exists(solanaMetadataPath):
            os.makedirs(solanaMetadataPath)
        createSolanaMetaData(name, Order_Num, full_single_dna, dnaDictionary, metadataMaterialDict, input.custom_Fields,
                             input.enableCustomFields, input.solana_description, solanaMetadataPath)

    if input.erc721MetaData:
        if not os.path.exists(erc721MetadataPath):
            os.makedirs(erc721MetadataPath)
        createErc721MetaData(name, Order_Num, full_single_dna, dnaDictionary, metadataMaterialDict, input.custom_Fields,
                             input.enableCustomFields, input.erc721_description, erc721MetadataPath)

    if not os.path.exists(BMNFT_metaData_Folder):
        os.makedirs(BMNFT_metaData_Folder)

    for b in dnaDictionary:
        if dnaDictionary[b] == "0":
            dnaDictionary[b] = "Empty"

    metaDataDict = {"name": name, "NFT_DNA": a, "NFT_Variants": dnaDictionary}
    metaDataDict = {"name": name, "NFT_DNA": a, "NFT_Variants": dnaDictionary,
                    "Material_Attributes": metadataMaterialDict}

    jsonMetaData = json.dumps(metaDataDict, indent=1, ensure_ascii=True)

    with open(os.path.join(metaDataFolder, "Data_" + name + ".json"), 'w') as outfile:
    with open(os.path.join(BMNFT_metaData_Folder, "Data_" + name + ".json"), 'w') as outfile:
        outfile.write(jsonMetaData + '\n')

    print(f"Completed {name} render in {time.time() - time_start_2}s")

    save_completed(single_dna, a, x, batch_json_save_path, batchToGenerate)
    save_completed(full_single_dna, a, x, input.batch_json_save_path, input.batchToGenerate)

    x += 1
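The per-NFT record written above can be sketched outside Blender; the names, DNA, and variants below are illustrative only, not values from the add-on:

```python
import json

# Hypothetical example of the "Data_<name>.json" record the exporter writes;
# every field value here is made up for illustration.
metaDataDict = {"name": "MyNFT_1",
                "NFT_DNA": "1-2-1",
                "NFT_Variants": {"Body": "Cube_1_25", "Eyes": "Round_2_75"},
                "Material_Attributes": {"Body": "2"}}

# Same serialization settings as the exporter uses.
jsonMetaData = json.dumps(metaDataDict, indent=1, ensure_ascii=True)
print(jsonMetaData)
```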
@ -301,5 +447,13 @@ def render_and_save_NFTs(nftName, maxNFTs, batchToGenerate, batch_json_save_path
        bpy.data.collections[j].hide_render = False
        bpy.data.collections[j].hide_viewport = False

    print(f"\nAll NFTs successfully generated and sent to {nftBatch_save_path}"
          f"\nCompleted all renders in Batch{batchToGenerate}.json in {time.time() - time_start_1}s\n")
    batch_complete_time = time.time() - time_start_1

    print(f"\nAll NFTs successfully generated and sent to {input.nftBatch_save_path}"
          f"\nCompleted all renders in Batch{input.batchToGenerate}.json in {batch_complete_time}s\n")

    batch_info = {"Batch Render Time": batch_complete_time, "Number of NFTs generated in Batch": x - 1,
                  "Average time per generation": batch_complete_time / (x - 1)}

    batch_infoFolder = os.path.join(input.nftBatch_save_path, "Batch" + str(input.batchToGenerate), "batch_info.json")
    save_batch(batch_info, batch_infoFolder)
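Since the counter `x` ends one past the last NFT generated, the average must divide the total by `x - 1` with explicit parentheses. A standalone check with made-up numbers:

```python
# Illustrative numbers only: 10 NFTs rendered in 250 s total.
x = 11  # counter ends one past the number of NFTs generated
batch_complete_time = 250.0

batch_info = {"Batch Render Time": batch_complete_time,
              "Number of NFTs generated in Batch": x - 1,
              "Average time per generation": batch_complete_time / (x - 1)}

print(batch_info["Average time per generation"])  # 25.0
```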
@ -155,7 +155,6 @@ def always_with_Rule_Check(hierarchy, deconstructed_DNA, num_List1, num_List2):
# Main Function
def logicafyDNAsingle(hierarchy, singleDNA, logicFile):

    logicFile = json.load(open(logicFile))
    deconstructed_DNA = singleDNA.split("-")

    didReconstruct = True
@ -0,0 +1,100 @@
# Purpose:
# The purpose of this file is to apply the materials a user sets in a given .json file to the Variant collection
# objects also specified in the .json file. The Materialized DNA is then returned in the following format: 1-1-1:1-1-1
# where the numbers right of the ":" are the material numbers applied to the respective Variants left of the ":".

import bpy

import json
import random


def select_material(materialList):
    """Selects a material from a passed material list."""
    number_List_Of_i = []
    rarity_List_Of_i = []
    ifZeroBool = None

    for material in materialList:
        material_order_num = material.split("_")[1]
        number_List_Of_i.append(material_order_num)

        material_rarity_percent = material.split("_")[2]  # rarity value, assuming "Name_OrderNum_Rarity" naming
        rarity_List_Of_i.append(float(material_rarity_percent))

    for x in rarity_List_Of_i:
        if x == 0:
            ifZeroBool = True
            break
        elif x != 0:
            ifZeroBool = False

    if ifZeroBool:
        selected_material = random.choices(number_List_Of_i, k=1)
    elif not ifZeroBool:
        selected_material = random.choices(number_List_Of_i, weights=rarity_List_Of_i, k=1)

    return selected_material[0]


def get_variant_att_index(variant, hierarchy):
    variant_attribute = None

    for attribute in hierarchy:
        for variant_h in hierarchy[attribute]:
            if variant_h == variant:
                variant_attribute = attribute

    attribute_index = list(hierarchy.keys()).index(variant_attribute)
    variant_order_num = variant.split("_")[1]
    return attribute_index, variant_order_num


def match_DNA_to_Variant(hierarchy, singleDNA):
    """
    Matches each DNA number separated by "-" to its attribute, then its variant.
    """
    listAttributes = list(hierarchy.keys())
    listDnaDeconstructed = singleDNA.split('-')
    dnaDictionary = {}

    for i, j in zip(listAttributes, listDnaDeconstructed):
        dnaDictionary[i] = j

    for x in dnaDictionary:
        for k in hierarchy[x]:
            kNum = hierarchy[x][k]["number"]
            if kNum == dnaDictionary[x]:
                dnaDictionary.update({x: k})
    return dnaDictionary


def apply_materials(hierarchy, singleDNA, materialsFile):
    """
    DNA with applied material example: "1-1:1-1" <Normal DNA>:<Selected Material for each Variant>

    The Material DNA will select the material for the Variant order number in the NFT DNA based on the Variant Material
    list in the Variant_Material.json file.
    """
    singleDNADict = match_DNA_to_Variant(hierarchy, singleDNA)
    materialsFile = json.load(open(materialsFile))
    deconstructed_MaterialDNA = {}

    for a in singleDNADict:
        complete = False
        for b in materialsFile:
            if singleDNADict[a] == b:
                mat = select_material(materialsFile[b]['Material List'])
                deconstructed_MaterialDNA[a] = mat
                complete = True
        if not complete:
            deconstructed_MaterialDNA[a] = "0"

    material_DNA = ""
    for a in deconstructed_MaterialDNA:
        num = "-" + str(deconstructed_MaterialDNA[a])
        material_DNA += num
    material_DNA = ''.join(material_DNA.split('-', 1))

    return f"{singleDNA}:{material_DNA}"
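The `<variant DNA>:<material DNA>` string returned by `apply_materials` pairs the two halves positionally. A minimal sketch of reading one back (the helper name is hypothetical, not part of the add-on):

```python
def split_materialized_dna(materialized_dna):
    # "1-2:3-0" -> [("1", "3"), ("2", "0")]: the first attribute's variant 1
    # uses material 3, the second attribute's variant 2 has no material ("0").
    variant_part, material_part = materialized_dna.split(":")
    return list(zip(variant_part.split("-"), material_part.split("-")))

print(split_materialized_dna("1-2:3-0"))  # [('1', '3'), ('2', '0')]
```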
@ -6,10 +6,22 @@
# This file returns the specified meta data format to the Exporter.py for a given NFT DNA.

import bpy
import os
import json


def sendMetaDataToJson(metaDataDict, save_path, file_name):
    jsonMetaData = json.dumps(metaDataDict, indent=1, ensure_ascii=True)
    with open(os.path.join(save_path, f"{file_name}.json"), 'w') as outfile:
        outfile.write(jsonMetaData + '\n')


def stripNums(variant):
    variant = str(variant).split('_')[0]
    return variant


# Cardano Template
def returnCardanoMetaData(name, NFT_DNA, NFT_Variants, custom_Fields_File, enableCustomFields, cardano_description):
def createCardanoMetadata(name, Order_Num, NFT_DNA, NFT_Variants, Material_Attributes,
                          custom_Fields, enableCustomFields, cardano_description, cardanoMetadataPath):

    metaDataDictCardano = {"721": {
        "<policy_id>": {
            name: {
@ -22,35 +34,48 @@ def returnCardanoMetaData(name, NFT_DNA, NFT_Variants, custom_Fields_File, enabl
            "version": "1.0"
    }}

    # Variants and Attributes:
    for i in NFT_Variants:
        metaDataDictCardano["721"]["<policy_id>"][name][i] = NFT_Variants[i]
        metaDataDictCardano["721"]["<policy_id>"][name][i] = stripNums(NFT_Variants[i])

    # Material Variants and Attributes:
    for i in Material_Attributes:
        metaDataDictCardano["721"]["<policy_id>"][name][i] = Material_Attributes[i]

    # Custom Fields:
    if enableCustomFields:
        custom_Fields = json.load(open(custom_Fields_File))
        for i in custom_Fields:
            metaDataDictCardano["721"]["<policy_id>"][name][i] = custom_Fields[i]

    return metaDataDictCardano
    sendMetaDataToJson(metaDataDictCardano, cardanoMetadataPath, name)


# Solana Template
def returnSolanaMetaData(name, NFT_DNA, NFT_Variants, custom_Fields_File, enableCustomFields, solana_description):
def createSolanaMetaData(name, Order_Num, NFT_DNA, NFT_Variants, Material_Attributes, custom_Fields, enableCustomFields,
                         solana_description, solanaMetadataPath):
    metaDataDictSolana = {"name": name, "symbol": "", "description": solana_description, "seller_fee_basis_points": None,
                          "image": "", "animation_url": "", "external_url": ""}

    attributes = []

    # Variant and Attributes:
    for i in NFT_Variants:
        dictionary = {
            "trait_type": i,
            "value": NFT_Variants[i]
            "value": stripNums(NFT_Variants[i])
        }
        attributes.append(dictionary)

    # Material Variants and Attributes:
    for i in Material_Attributes:
        dictionary = {
            "trait_type": i,
            "value": Material_Attributes[i]
        }
        attributes.append(dictionary)

    # Custom Fields:
    if enableCustomFields:
        custom_Fields = json.load(open(custom_Fields_File))
        for i in custom_Fields:
            dictionary = {
                "trait_type": i,
@ -69,10 +94,13 @@ def returnSolanaMetaData(name, NFT_DNA, NFT_Variants, custom_Fields_File, enable
        "category": "",
        "creators": [{"address": "", "share": None}]
    }
    return metaDataDictSolana

    sendMetaDataToJson(metaDataDictSolana, solanaMetadataPath, name)


# ERC721 Template
def returnErc721MetaData(name, NFT_DNA, NFT_Variants, custom_Fields_File, enableCustomFields, erc721_description):
def createErc721MetaData(name, Order_Num, NFT_DNA, NFT_Variants, Material_Attributes, custom_Fields, enableCustomFields,
                         erc721_description, erc721MetadataPath):
    metaDataDictErc721 = {
        "name": name,
        "description": erc721_description,
@ -82,17 +110,26 @@ def returnErc721MetaData(name, NFT_DNA, NFT_Variants, custom_Fields_File, enable
    attributes = []

    # Variants and Attributes:
    for i in NFT_Variants:
        dictionary = {
            "trait_type": i,
            "value": NFT_Variants[i]
            "value": stripNums(NFT_Variants[i])
        }

        attributes.append(dictionary)

    # Material Variants and Attributes:
    for i in Material_Attributes:
        dictionary = {
            "trait_type": i,
            "value": Material_Attributes[i]
        }

        attributes.append(dictionary)

    # Custom Fields:
    if enableCustomFields:
        custom_Fields = json.load(open(custom_Fields_File))
        for i in custom_Fields:
            dictionary = {
                "trait_type": i,
@ -102,4 +139,5 @@ def returnErc721MetaData(name, NFT_DNA, NFT_Variants, custom_Fields_File, enable
    metaDataDictErc721["attributes"] = attributes

    return metaDataDictErc721
    sendMetaDataToJson(metaDataDictErc721, erc721MetadataPath, name)
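For reference, the ERC721 record assembled above ends up shaped roughly like this; the field values are made up, and `stripNums` has already removed the `_Order_Rarity` suffix from each variant name:

```python
# Hypothetical output of createErc721MetaData for one NFT; values are illustrative.
metaDataDictErc721 = {
    "name": "MyNFT_1",
    "description": "An example description.",
    "attributes": [
        {"trait_type": "Body", "value": "Cube"},  # variant, suffix stripped by stripNums
        {"trait_type": "Body", "value": "2"},     # material number for that attribute
    ],
}

print(metaDataDictErc721["attributes"])
```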
@ -3,228 +3,46 @@
import bpy
import os
import copy
import json
import shutil
from . import Metadata

from .Constants import bcolors, removeList, remove_file_by_extension


def getNFType(nftBatch_save_path):
    images = False
    animations = False
    models = False
    metaData = False

    batch1 = sorted(remove_file_by_extension(os.listdir(nftBatch_save_path)))[0]
    batchContent = remove_file_by_extension(os.listdir(os.path.join(nftBatch_save_path, batch1)))

    if "Images" in batchContent:
        images = True
    if "Animations" in batchContent:
        animations = True
    if "Models" in batchContent:
        models = True
    if "BMNFT_metaData" in batchContent:
        metaData = True

    return images, animations, models, metaData


def getMetaDataDirty(completeMetaDataPath, i):
    """
    Retrieves a given batch's data determined by renderBatch in config.py
    """
    file_name = os.path.join(completeMetaDataPath, i)
    metaDataDirty = json.load(open(file_name))

    name = metaDataDirty["name"]
    NFT_DNA = metaDataDirty["NFT_DNA"]
    NFT_Variants = metaDataDirty["NFT_Variants"]

    for i in NFT_Variants:
        x = NFT_Variants[i]
        NFT_Variants[i] = x.split("_")[0]

    return name, NFT_DNA, NFT_Variants


def sendMetaDataToJson(metaDataDict, metaDataPath, jsonName):
    jsonMetaData = json.dumps(metaDataDict, indent=1, ensure_ascii=True)
    with open(os.path.join(metaDataPath, jsonName), 'w') as outfile:
        outfile.write(jsonMetaData + '\n')


def renameMetaData(rename_MetaData_Variables):
    metaDataListOld = os.listdir(rename_MetaData_Variables.completeMetaDataPath)
    cardanoMetaDataPath = os.path.join(rename_MetaData_Variables.completeCollPath, "Cardano_metaData")
    solanaMetaDataPath = os.path.join(rename_MetaData_Variables.completeCollPath, "Solana_metaData")
    erc721MetaDataPath = os.path.join(rename_MetaData_Variables.completeCollPath, "Erc721_metaData")

    for i in metaDataListOld:
        name, NFT_DNA, NFT_Variants = getMetaDataDirty(rename_MetaData_Variables.completeMetaDataPath, i)

        file_name = os.path.splitext(i)[0]
        file_num = file_name.split("_")[1]

        if rename_MetaData_Variables.cardanoMetaDataBool:
            if not os.path.exists(cardanoMetaDataPath):
                os.mkdir(cardanoMetaDataPath)

            cardanoJsonNew = "Cardano_" + i
            cardanoNewName = name.split("_")[0] + "_" + str(file_num)

            metaDataDictCardano = Metadata.returnCardanoMetaData(cardanoNewName, NFT_DNA, NFT_Variants, rename_MetaData_Variables.custom_Fields_File, rename_MetaData_Variables.enableCustomFields, rename_MetaData_Variables.cardano_description)

            sendMetaDataToJson(metaDataDictCardano, cardanoMetaDataPath, cardanoJsonNew)

        if rename_MetaData_Variables.solanaMetaDataBool:
            if not os.path.exists(solanaMetaDataPath):
                os.mkdir(solanaMetaDataPath)

            solanaJsonNew = "Solana_" + i
            solanaNewName = name.split("_")[0] + "_" + str(file_num)

            metaDataDictSolana = Metadata.returnSolanaMetaData(solanaNewName, NFT_DNA, NFT_Variants, rename_MetaData_Variables.custom_Fields_File, rename_MetaData_Variables.enableCustomFields, rename_MetaData_Variables.solana_description)

            sendMetaDataToJson(metaDataDictSolana, solanaMetaDataPath, solanaJsonNew)

        if rename_MetaData_Variables.erc721MetaData:
            if not os.path.exists(erc721MetaDataPath):
                os.mkdir(erc721MetaDataPath)

            erc721JsonNew = "Erc721_" + i
            erc721NewName = name.split("_")[0] + "_" + str(file_num)

            metaDataDictErc721 = Metadata.returnErc721MetaData(erc721NewName, NFT_DNA, NFT_Variants, rename_MetaData_Variables.custom_Fields_File, rename_MetaData_Variables.enableCustomFields, rename_MetaData_Variables.erc721_description)

            sendMetaDataToJson(metaDataDictErc721, erc721MetaDataPath, erc721JsonNew)
    return


def reformatNFTCollection(refactor_panel_input):
    images, animations, models, metaData = getNFType(refactor_panel_input.nftBatch_save_path)

    global completeCollPath
    global completeMetaDataPath

    completeCollPath = os.path.join(refactor_panel_input.save_path, "Blend_My_NFTs Output", "Complete_Collection")
    completeImagePath = os.path.join(completeCollPath, "Images")
    completeAnimationsPath = os.path.join(completeCollPath, "Animations")
    completeModelsPath = os.path.join(completeCollPath, "Models")
    completeMetaDataPath = os.path.join(completeCollPath, "BMNFT_metaData")

    if not os.path.exists(completeCollPath):
        os.mkdir(completeCollPath)
    if images and not os.path.exists(completeImagePath):
        os.mkdir(completeImagePath)
    if animations and not os.path.exists(completeAnimationsPath):
        os.mkdir(completeAnimationsPath)
    if models and not os.path.exists(completeModelsPath):
        os.mkdir(completeModelsPath)
    if metaData and not os.path.exists(completeMetaDataPath):
        os.mkdir(completeMetaDataPath)

    batchListDirty = os.listdir(refactor_panel_input.nftBatch_save_path)
    batchList = [x for x in batchListDirty if (x not in removeList)]
    batchList = remove_file_by_extension(batchListDirty)
    collection_info = {"Total Time": 0}

    imageCount = 1
    animationCount = 1
    modelCount = 1
    dataCount = 1
    for i in batchList:
        if images:
            imagesDir = os.path.join(refactor_panel_input.nftBatch_save_path, i, "Images")
            imagesList = sorted(os.listdir(imagesDir))
    for folder in batchList:
        batch_info = json.load(open(os.path.join(refactor_panel_input.nftBatch_save_path, folder, "batch_info.json")))
        collection_info[os.path.basename(folder)] = batch_info
        collection_info["Total Time"] = collection_info["Total Time"] + batch_info["Batch Render Time"]

            for j in imagesList:
                imageOldPath = os.path.join(refactor_panel_input.nftBatch_save_path, i, "Images", j)
                nameOldDirty = copy.deepcopy(os.path.splitext(j)[0])
                extension = copy.deepcopy(os.path.splitext(j)[1])
                nameOldClean = nameOldDirty.split("_")[0]
        fileListDirty = os.listdir(os.path.join(refactor_panel_input.nftBatch_save_path, folder))
        filelist = remove_file_by_extension(fileListDirty)

                nameNew = nameOldClean + "_" + str(imageCount)
                imageNewPath = os.path.join(completeImagePath, nameNew + extension)
        for mediaTypeFolder in filelist:
            if mediaTypeFolder != "batch_info.json":
                mediaTypeFolderDir = os.path.join(refactor_panel_input.nftBatch_save_path, folder, mediaTypeFolder)

                os.rename(imageOldPath, imageNewPath)
                for i in os.listdir(mediaTypeFolderDir):
                    destination = os.path.join(completeCollPath, mediaTypeFolder)
                    if not os.path.exists(destination):
                        os.makedirs(destination)

                imageCount += 1
                    shutil.move(os.path.join(mediaTypeFolderDir, i), destination)

        if animations:
            animationsDir = os.path.join(refactor_panel_input.nftBatch_save_path, i, "Animations")
            animationsList = sorted(os.listdir(animationsDir))

            for j in animationsList:
                animationOldPath = os.path.join(refactor_panel_input.nftBatch_save_path, i, "Animations", j)
                nameOldDirty = copy.deepcopy(os.path.splitext(j)[0])
                extension = copy.deepcopy(os.path.splitext(j)[1])
                nameOldClean = nameOldDirty.split("_")[0]

                nameNew = nameOldClean + "_" + str(animationCount)
                animationNewPath = os.path.join(completeAnimationsPath, nameNew + extension)

                os.rename(animationOldPath, animationNewPath)

                animationCount += 1

        if models:
            modelsDir = os.path.join(refactor_panel_input.nftBatch_save_path, i, "Models")
            modelsList = sorted(os.listdir(modelsDir))

            for j in modelsList:
                modelOldPath = os.path.join(refactor_panel_input.nftBatch_save_path, i, "Models", j)
                nameOldDirty = copy.deepcopy(os.path.splitext(j)[0])
                extension = copy.deepcopy(os.path.splitext(j)[1])
                nameOldClean = nameOldDirty.split("_")[0]

                nameNew = nameOldClean + "_" + str(modelCount)
                modelsNewPath = os.path.join(completeModelsPath, nameNew + extension)

                os.rename(modelOldPath, modelsNewPath)

                modelCount += 1

        if metaData:
            dataDir = os.path.join(refactor_panel_input.nftBatch_save_path, i, "BMNFT_metaData")
            dataList = sorted(os.listdir(dataDir))

            for j in dataList:
                dataOldPath = os.path.join(refactor_panel_input.nftBatch_save_path, i, "BMNFT_metaData", j)
                nameOldDirty = copy.deepcopy(os.path.splitext(j)[0])
                extension = copy.deepcopy(os.path.splitext(j)[1])
                nameOldClean = nameOldDirty.split("_")[0]

                nameNew = nameOldClean + "_" + str(dataCount)
                dataNewPath = os.path.join(completeMetaDataPath, nameNew + extension)
                os.rename(dataOldPath, dataNewPath)

                BMFNT_Meta = json.load(open(dataNewPath))
                name = BMFNT_Meta["name"].split("_")[0]
                BMFNT_Meta["name"] = name + "_" + str(dataCount)
                jsonMetaData = json.dumps(BMFNT_Meta, indent=1, ensure_ascii=True)

                with open(dataNewPath, 'w') as outfile:
                    outfile.write(jsonMetaData + '\n')

                dataCount += 1
    collection_info = json.dumps(collection_info, indent=1, ensure_ascii=True)
    with open(os.path.join(completeCollPath, "collection_info.json"), 'w') as outfile:
        outfile.write(collection_info + '\n')

    print(f"All NFT files stored and sorted to the Complete_Collection folder in {refactor_panel_input.save_path}")

    class rename_MetaData_Variables:
        completeCollPath = completeCollPath
        completeMetaDataPath = completeMetaDataPath

        cardanoMetaDataBool = refactor_panel_input.cardanoMetaDataBool
        solanaMetaDataBool = refactor_panel_input.solanaMetaDataBool
        erc721MetaData = refactor_panel_input.erc721MetaData

        custom_Fields_File = refactor_panel_input.custom_Fields_File
        enableCustomFields = refactor_panel_input.enableCustomFields

        cardano_description = refactor_panel_input.cardano_description
        solana_description = refactor_panel_input.solana_description
        erc721_description = refactor_panel_input.erc721_description

    renameMetaData(rename_MetaData_Variables)

    shutil.rmtree(refactor_panel_input.nftBatch_save_path)
@ -19,7 +19,18 @@ class Loader:
        self.timeout = timeout

        self._thread = Thread(target=self._animate, daemon=True)
        self.steps = ["⢿", "⣻", "⣽", "⣾", "⣷", "⣯", "⣟", "⡿"]
        self.steps = [
            " [== ]",
            " [ == ]",
            " [ == ]",
            " [ == ]",
            " [ == ]",
            " [ ==]",
            " [ == ]",
            " [ == ]",
            " [ == ]",
            " [ == ]",
        ]
        self.done = False

    def start(self):
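The `Loader` advances through `self.steps` on its daemon thread until `done` is set. A minimal standalone sketch of that frame cycling (frames and timing are illustrative, not the add-on's):

```python
import itertools
import sys
import time

steps = ["[==  ]", "[ == ]", "[  ==]", "[ == ]"]  # illustrative frames

def spin(cycles, delay=0.0):
    # Redraw the same console line for each frame, similar to Loader._animate.
    for frame in itertools.islice(itertools.cycle(steps), cycles):
        sys.stdout.write("\r" + frame)
        sys.stdout.flush()
        time.sleep(delay)

spin(8)
```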