There’s lots to comment on with these changes, but I’ve only had limited time to play with them (it’s been sunny here) … so these’ll have to do for now.
Say user selects a 3x2 region of faces on a sphere
Invokes extrude region normal… and gets a single widget (ok, it changes for the custom option … but still one)
User alters distance / inset etc. (maybe switches to custom … the distance value is lost but the inset is kept)
Anyway, assume they just used normal + inset
Then uses rotate (region) … sunglasses … can’t cope, so goes back to extrude (normal) …
And now gets individual widgets for each face, as opposed to how it was before.
… presumably it should revert to the same state as if it was a fresh extrude op?
Still can’t see the need for a (full / complete) widget implementation when any single one does the same job as any other … it just clutters up the workspace imo.
(I’m assuming the logic / reasoning is to show the user the axis / plane for each face, i.e. ‘at the point of application’ of the tool – but you’re also asking the user to imagine the effect when using a remote axis, as in custom target rotate … I’d have thought just an axis display would’ve been sufficient, if you really must have something for each face, plus a single global rotate blob?)
Snapping is a big +
… but I don’t see a way of defining an axis between a couple of verts … and then being able to transfer that axis to act through another (user-definable) point … imagine a circular hatch on a spacecraft whose hinges are not aligned to xyz.
User picks up the hinge axis from the hatch diameter elements / geometry … but then wants to transfer it to act through the correct hinge location?
Maybe some way by which user can click on both widgets and then move them together?
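For what it’s worth, the maths behind ‘pick an axis from two verts, then have it act through a different point’ is just Rodrigues’ rotation with a translated pivot. A minimal Python sketch of what I mean (function names are made up, points assumed to be plain 3-tuples):

```python
import math

def rotate_about_axis(p, axis_a, axis_b, pivot, angle):
    """Rotate point p by `angle` radians about the direction (axis_b - axis_a),
    with that axis translated so it passes through `pivot` (Rodrigues' formula)."""
    # Unit axis direction taken from the two picked verts
    d = [b - a for a, b in zip(axis_a, axis_b)]
    n = math.sqrt(sum(c * c for c in d))
    k = [c / n for c in d]
    # Work relative to the user-chosen pivot, not the picked verts
    v = [pc - qc for pc, qc in zip(p, pivot)]
    cos_t, sin_t = math.cos(angle), math.sin(angle)
    kxv = [k[1] * v[2] - k[2] * v[1],   # k x v
           k[2] * v[0] - k[0] * v[2],
           k[0] * v[1] - k[1] * v[0]]
    kdv = sum(kc * vc for kc, vc in zip(k, v))  # k . v
    r = [v[i] * cos_t + kxv[i] * sin_t + k[i] * kdv * (1 - cos_t) for i in range(3)]
    return tuple(rc + qc for rc, qc in zip(r, pivot))
```

So the hatch case is just: take the direction from the hatch-rim verts, but pass the hinge location in as `pivot`.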
Imo, the second info panel on the rhs is just a waste of modelling-window real estate … possibly brought on by splitting various commands in two, which I also think is unnecessary (excepting scale, possibly)
I wondered if there’s a case to be made for making the 2 widgets slightly different, so the user can differentiate between them more easily … and also, when they (initially) appear on top of each other at the origin, it’s rather confusing as it seems there’s only one.
Have you considered just using a single widget for all options – including the target / vector stuff – that allows the user to ‘pull out’ (say) the necessary additional functionality when needed … rather than having to access extra stuff via dialogs / panels?
Eg for rotate (say) … the user Alt-clicks on the middle blob … and ‘pulls out’ another blob … with a ‘trailing string’
This second blob can then be used to define the second point of an axis, with the displayed string acting as the axis display?
Yes, there’d be no xyz arrows with this for ortho adjustment … but (smaller?) ones could appear as soon as the scout-ship blob is pulled from its mother ship … or, maybe, have none at all, and the user again uses the mother-ship arrows in conjunction with Alt (or whatever modifier key) to adjust the scout-ship blob position only. If so, could the situation above (moving both points together) also be accomplished in a similar way … maybe by clicking on the axis display somehow?
Re the UI – imo there’s already too much wasted space on the single lhs panel for the info that’s actually being offered to the user … let alone having a rhs one too.
Dunno if this is possible, but I was wondering if mid-point snapping on an edge is available somehow?
Suitably implemented, this would offer the user a quick / easy way of defining such an axis / vector?
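If it helps make the case: the midpoint itself is trivially cheap, and two midpoints immediately give you an axis. A hypothetical sketch (names made up, coordinates as 3-tuples):

```python
def edge_midpoint(v0, v1):
    """Snap target: the midpoint of an edge, given its two endpoint coordinates."""
    return tuple((a + b) / 2.0 for a, b in zip(v0, v1))

def axis_from_midpoints(edge_a, edge_b):
    """Define an axis as (point, direction) through the midpoints of two edges."""
    m0 = edge_midpoint(*edge_a)
    m1 = edge_midpoint(*edge_b)
    direction = tuple(b - a for a, b in zip(m0, m1))
    return m0, direction
```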
Sorry if the above sounds unduly negative … I think there’s a lot of potential here, but it needs ‘teasing out’ … and this is just from my pov, of course … you don’t have to (and probably won’t) agree