As promised, I implemented the variant swapping functionality. It is entirely data driven: as the user selects a piece of clothing, the tool searches the appropriate 'sourceimages' subdirectory of the Maya project and lists every folder that contains at least a file called DIFF_folderName.jpg, which validates it as a texture option for that piece of clothing. The tool also determines how many color detail masks are stored in each folder and reports that back to the GUI, which shows/hides and enables/disables the color picker options depending on the number of masks available per variation. Some of these UI changes came out of the usability tests I conducted earlier today, where I found that a majority of the users had trouble understanding how the colors worked: I was only disabling them when the active variant did not support them, and the graying out did not effectively communicate that they were no longer functional. What worried me the most was that the navigation was not clear; users would typically tweak the parameters on the first screen and then hit the 'Export' button, never realizing the tabs existed. For that reason I created two options for changing the UI layout, and the team is currently voting on them. Another change you might have noticed in the demonstration video is the addition of template saving. I can now store all the attributes that define a character in a small XML file, and will soon implement importing that information back in, so that a character design can be iterated upon.
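The data-driven variant scan described above works roughly like this (a plain-Python sketch; the MASK_ prefix for the color detail masks is my assumption, not the tool's actual naming convention):

```python
import os

def find_variants(sourceimages_dir, clothing_item):
    """Scan a clothing item's subdirectory of 'sourceimages' and return
    the valid texture variants, mapped to their number of detail masks."""
    item_dir = os.path.join(sourceimages_dir, clothing_item)
    variants = {}
    if not os.path.isdir(item_dir):
        return variants
    for folder in sorted(os.listdir(item_dir)):
        folder_path = os.path.join(item_dir, folder)
        if not os.path.isdir(folder_path):
            continue
        files = os.listdir(folder_path)
        # A folder only counts as a variant if it holds DIFF_<folderName>.jpg.
        if 'DIFF_%s.jpg' % folder not in files:
            continue
        # Count the color detail masks (assumed here to be MASK_*.jpg) so the
        # GUI can enable/disable its color pickers for this variant.
        masks = [f for f in files
                 if f.startswith('MASK_') and f.endswith('.jpg')]
        variants[folder] = len(masks)
    return variants
```

The GUI would then call this once per clothing selection and toggle its color picker widgets based on the mask count.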
So far this week I have finalized the basic export functionality. I solved last week's texture issue, where the secondary blend colors would sometimes bleed into the final texture when running Maya's 'Convert to File Texture'. To address it, I baked the diffuse channels from the layered textures on the torso and the legs separately, used the resulting files to replace the layered texture nodes, and then ran 'Convert to File Texture'. This way, the command does not need to deal with nested layered textures or with factoring the alpha channel into each of their inputs. The following video demonstrates the export process. The tool looks for a directory in Maya's default project path called 'exports' and creates it if it does not exist. It generates a unique id for the character from the user's name and a timestamp, with the possibility of adding an optional tag to identify the character. It outputs a final diffuse color map and an FBX with the skinned mesh. As you may notice, there is a change in the way the texture looks compared to the layered texture setup. The Maya swatches for the material's color and the final output match perfectly, but the way the viewport handles the layered texture and its blending modes can produce a different look from simply plugging in an equivalent texture. I will try to address this in the coming week.
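The exports-directory setup and the unique id could be sketched like this (the exact file-name pattern, including the _DIFF suffix, is my illustrative assumption):

```python
import getpass
import os
from datetime import datetime

def make_export_paths(project_path, tag=None):
    """Ensure the 'exports' directory exists under the project path and
    build output paths from a unique character id: the user's name plus a
    timestamp, with an optional tag appended."""
    exports_dir = os.path.join(project_path, 'exports')
    if not os.path.isdir(exports_dir):
        os.makedirs(exports_dir)
    stamp = datetime.now().strftime('%Y%m%d_%H%M%S')
    parts = [getpass.getuser(), stamp]
    if tag:
        parts.append(tag)
    character_id = '_'.join(parts)
    # One flattened diffuse map and one skinned FBX per exported character.
    diffuse_path = os.path.join(exports_dir, character_id + '_DIFF.jpg')
    fbx_path = os.path.join(exports_dir, character_id + '.fbx')
    return diffuse_path, fbx_path
```

The timestamp makes collisions between exports by the same user practically impossible, while the tag keeps the ids human-readable.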
I am currently working on the functionality of the remaining buttons (Load and Save in particular), and expect them to be ready sometime tomorrow. In spite of the problems I described earlier this week with implementing a color picker in Qt Creator, I managed to find a way to nest a colorSliderGrp from Maya/Python in the resulting GUI. The trick was querying the path of a Qt control sitting at the same hierarchical level where you want the color picker to be. I had tried this before, but it would appear that Qt Creator (or Maya?) has a maximum hierarchical depth past which 'Layout' elements cease to be listed in the full path of a widget, so I restructured the UI to simplify the addition of the new controls. With that out of the way, I implemented a first pass of the functionality. When swapping between meshes for the torso and legs, the tool automatically changes the skinMask textures and the diffuse base for each clothing item. It also lets the user change that base color using a color picker and one of five blending modes (Add, Multiply, Lighten, Darken, and Overlay). While the first four are natively supported by Maya's layered texture, Overlay was implemented by replicating the math Photoshop uses through a network of utility nodes. The following video shows the current state of the tool. As it stands, there is a problem where flattening the torso's layered texture sometimes introduces artifacts in the final texture.
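For reference, the per-channel math that the utility-node network replicates is Photoshop's standard Overlay formula: Multiply in the shadows, Screen in the highlights. A minimal sketch:

```python
def overlay(base, blend):
    """Photoshop-style Overlay blend for a single channel in [0, 1].

    Below 0.5 the base is multiplied (darkened); at or above 0.5 it is
    screened (lightened). In Maya this branch maps onto a condition node
    feeding two multiplyDivide sub-networks.
    """
    if base < 0.5:
        return 2.0 * base * blend
    return 1.0 - 2.0 * (1.0 - base) * (1.0 - blend)
```

Pure black and pure white in the base layer are preserved regardless of the blend color, which is why Overlay reads as a contrast-preserving tint.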
I will address this problem in the coming sprint, as well as transition from the current model, where most paths and options are hard-coded, to a more data-driven structure. I have also begun to hold usability tests by letting some of the end users play with the interface. During the rest of the week I will create a more directed version of the test and a companion survey to collect first impressions. I also intend to record both the screen and the users themselves as they play around with the tool, to gauge their reactions. Even though the lack of native Qt Creator support for a color picker might make me discard this workflow altogether and build my UI directly in Python, in order to keep up with my schedule (and since UI and functionality are independent in my code), I decided to use the UI created in Qt to implement the first pass of the functionality. This consists of being able to swap meshes and control blend shapes through the sliders in the interface. A demonstration of the current state can be seen in the next video. Since the current UI lacks the color picker needed for one of this week's tasks, I decided instead to address the issue of automating the inclusion of the transparency information as an alpha channel, rather than as the separate texture produced by Maya's 'Convert to File Texture' command. The options I had available were:
Since the first two options involved having the end user install additional software and possibly do some troubleshooting to make it work correctly with Python (I had trouble with win32com myself), I opted for the third one. I wrote a JavaScript tool that inverts the color information in the red channel of the alpha map Maya generates, copies that information into a new alpha channel on the color texture, and saves it under a new name. The paths to the different files are stored in a small XML file that the JSX reads and Python can easily change, and the JSX script is called directly from Maya through Python's command line functions. Here is a small clip of the tool in action. To choose the best elements from the UI mockups I created last week, I made a survey using SurveyMonkey and asked the Focal Length team (producers and artists), as well as the Art and Technical Art faculty at FIEA, to take it. Here are the results. The numbers speak for themselves in almost all of the questions, giving me a very clear idea of how to build the UI prototype. It is important to note that these decisions are by no means final; iteration will continue as the fidelity of the prototypes and mockups gets closer to the final tool. The only question on which opinion was divided was the one about how to display the general settings (gender, weight, age, etc.): around 40% liked mockup A, 30% liked mockup B, and 23% preferred a third, unrepresented option. I decided to combine mockups A and B in this pass of the UI and to address in the following UX test whether icons would improve readability. With the data in mind, I built the following widget using Qt Creator. Unfortunately, Qt Creator does not have a color picker widget.
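Stepping back to the alpha-merge tool for a moment, the Python side of that hand-off could look like this (the XML element names and the launch helper are illustrative assumptions, not the tool's actual schema):

```python
import subprocess
import xml.etree.ElementTree as ET

def write_jsx_config(config_path, color_map, alpha_map, output_path):
    """Write the small XML file the JSX script reads to locate its inputs.
    Python rewrites this file before each run; the JSX only reads it."""
    root = ET.Element('alphaMerge')
    ET.SubElement(root, 'colorMap').text = color_map
    ET.SubElement(root, 'alphaMap').text = alpha_map
    ET.SubElement(root, 'output').text = output_path
    ET.ElementTree(root).write(config_path)

def run_jsx(photoshop_exe, jsx_script):
    """Launch the JSX from Maya's Python. This assumes a Photoshop install
    that executes a .jsx file passed on its command line."""
    subprocess.call([photoshop_exe, jsx_script])
```

Keeping all paths in the XML means the JSX never needs editing; Python owns the configuration and the script stays a fixed black box.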
I spent some time researching ways to get one working, but the options I have found so far (QColorDialog, QColorTriangle, PyQt4) would require the end user to download and install extra software to run the tool, and possibly even build some classes from the command line. This would make for a much worse UX, since the tool would be hard to set up for non-technical people (which most of the potential users are). I will still run some more tests to see if I can add a colorSliderGrp to the docked widget after loading the .ui file in Maya, but if those tests prove unsuccessful, I will probably have to rebuild the UI directly in Python. While that solution would make the tool easier to deploy to the end users' systems, it would be a setback schedule-wise.