[Image: fur normals.jpg, 1920x1975]

🧵 Untitled Thread

Anonymous No. 920864

Sup guys. I don't know where else to post this, so I come to you. I'm in pursuit of a cool-looking toon fur effect for Blender. The idea I have is to create a bunch of particles scattered along the creature's body, where each individual particle adopts the vertex normal it's nearest to, resulting in each strand of fur seamlessly blending with the mesh.

It's kind of hard to explain, so if you have any questions, I'll try to answer them the best I can. But if you know what I mean, then the concept is actually rather simple. I just don't know how to do it. And apparently, nobody else can figure it out either. I've already looked around, and found this guy's question on stack exchange. https://blender.stackexchange.com/questions/261659/how-to-sample-surface-normals-to-hair

He has the same idea, and provides a proof of concept by baking normals into the fur. But what he and I are seeking is something that doesn't require baking and works in real time. So please, if you have any ideas on the matter. ANY ideas. I would greatly appreciate it.

Anonymous No. 920868

>>920864
Shouldn't it be possible to use the "Normal Edit" modifier and feed it a separate mesh to use as a reference? I feel like you should be able to do this easily by just putting that modifier on your particle object and then pointing it to whatever your base object is.
It's probably more complex than that, and probably won't work, but it might be worth looking into.

[Image: Normal Test 01.png, 1410x656]

Anonymous No. 920875

>>920868
I'm not sure how that would work, but I'm testing it anyway.
I created new objects to test this on: one a torus, and one a little 3-sided cone that I'll call "spike". I gave them both the same default material, and then I lit them under a sun light. I snapped the little spike flush to one face of the torus. Applied scaling too, if that matters. If the experiment goes right, then the spike should be the same shade as the face it's attached to, effectively blending in.

After poking around with Normal Edit, I haven't gotten anywhere. Got the sides of the spike to change, but not in the desired way. I barely know how Normal Edit works; the difference between radial, parallel, and directional is unfamiliar to me. I read the manual, but I'm still not understanding it. From what I'm barely gathering, *none* of these options will cause the spike to adopt the nearest normal of the torus. Could be wrong. But I dunno.

I moved the origin of the spike to the same location as the origin of the torus. That made the normals change. They didn't change how I wanted, but they changed.

Anonymous No. 920876

4 lines of code in houdini:

int hitprim;
vector hituv;
// closest point on input 1's surface: returns the primitive hit and its parametric uv
xyzdist(1, v@P, hitprim, hituv);
// sample the surface normal at that spot and assign it to this point
v@N = primuv(1, "N", hitprim, hituv);

or you just get them for free from a scatter node

i don't use blender but:
https://docs.blender.org/manual/en/latest/modeling/geometry_nodes/point/distribute_points_on_faces.html

this will scatter points on a surface and inherit normals

alternatively:

https://docs.blender.org/manual/en/latest/modeling/geometry_nodes/geometry/geometry_proximity.html

this will find the nearest position on target geo

i think this is the node you need to pair it with to pull attributes from target geo:
https://docs.blender.org/manual/en/latest/modeling/geometry_nodes/attribute/transfer_attribute.html

but like i said, i don't use blender because my brain is very, very large.
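in python terms, OP's "each particle adopts the nearest vertex normal" idea is just a closest-point query. a minimal sketch with a made-up toy mesh (real meshes would want a KD-tree instead of brute force):

```python
def nearest_vertex_normal(query, verts, normals):
    """Return the normal of the mesh vertex closest to `query` (brute force)."""
    best = min(range(len(verts)),
               key=lambda i: sum((q - v) ** 2 for q, v in zip(query, verts[i])))
    return normals[best]

# Toy "mesh": four vertices of a unit square, each with its own (made-up) normal.
verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
normals = [(0, 0, 1), (1, 0, 0), (0, 0, 1), (-1, 0, 0)]

# A fur strand rooted near vertex 1 inherits that vertex's normal.
print(nearest_vertex_normal((0.9, 0.1, 0.2), verts, normals))  # (1, 0, 0)
```

the scatter node route gets you this for free at distribution time; the brute-force query is what you'd need if the strands already exist as separate geometry.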

[Image: Normal Test 02.png, 1742x847]

Anonymous No. 920881

>>920876
I'm going to sleep on this.
I have played with all of those nodes before. But I don't really get how to arrange them in such a way to get the results I'm looking for. In any case, I'm too tired to think. So I'll come back to it tomorrow. If you have any more insight, let me know.

Anonymous No. 920886

>>920881
1. i don't use blender
2. i don't know how instancing for geo nodes works
3. from looking at your result it seems the instancer is actually using the surface normal to orient copies (surface normal looks like the Z direction of the copy). do you want this to happen?
4. i don't know how attribute inheritance works for instanced copies. look into this, if there's just a built in way to inherit attributes from the points it's going to be the fastest way.
5. if there is no automatic way to do this, then for each of the instanced copies you need to find the nearest point on the torus surface (geo proximity node), and from that point transfer the normal to your instance geo (transfer attribute). before doing this however, you may need to use this, because blender may be 'packing' geometry before instancing it, this will unpack it:
https://docs.blender.org/manual/en/latest/modeling/geometry_nodes/instances/realize_instances.html

all that said:
1. i don't use blender
2. i don't vidya shit, and i don't know about this normal hacking stuff. to me this is an orientation problem, but vidya people have bigger brains than me.
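to spell out step 5 in code: find the closest point on a target triangle (the geometry proximity part), then transfer the barycentric-interpolated vertex normal from it (the transfer attribute part). a pure-Python sketch following the standard closest-point-on-triangle routine from Ericson's "Real-Time Collision Detection"; the triangle and normals are made up for illustration:

```python
def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def lerp3(a, b, c, u, v, w):
    return tuple(u * x + v * y + w * z for x, y, z in zip(a, b, c))

def closest_point_barycentric(p, a, b, c):
    """Barycentric coords (u, v, w) of the point on triangle abc closest to p."""
    ab, ac, ap = sub(b, a), sub(c, a), sub(p, a)
    d1, d2 = dot(ab, ap), dot(ac, ap)
    if d1 <= 0 and d2 <= 0:
        return (1.0, 0.0, 0.0)                      # closest to vertex a
    bp = sub(p, b)
    d3, d4 = dot(ab, bp), dot(ac, bp)
    if d3 >= 0 and d4 <= d3:
        return (0.0, 1.0, 0.0)                      # closest to vertex b
    if d1 * d4 - d3 * d2 <= 0 and d1 >= 0 and d3 <= 0:
        v = d1 / (d1 - d3)
        return (1.0 - v, v, 0.0)                    # closest to edge ab
    cp = sub(p, c)
    d5, d6 = dot(ab, cp), dot(ac, cp)
    if d6 >= 0 and d5 <= d6:
        return (0.0, 0.0, 1.0)                      # closest to vertex c
    if d5 * d2 - d1 * d6 <= 0 and d2 >= 0 and d6 <= 0:
        w = d2 / (d2 - d6)
        return (1.0 - w, 0.0, w)                    # closest to edge ac
    va = d3 * d6 - d5 * d4
    if va <= 0 and d4 - d3 >= 0 and d5 - d6 >= 0:
        w = (d4 - d3) / ((d4 - d3) + (d5 - d6))
        return (0.0, 1.0 - w, w)                    # closest to edge bc
    vb, vc = d5 * d2 - d1 * d6, d1 * d4 - d3 * d2
    denom = va + vb + vc
    v, w = vb / denom, vc / denom
    return (1.0 - v - w, v, w)                      # projects inside the face

# One triangle in the XY plane, with (made-up) per-vertex normals.
tri = ((0, 0, 0), (1, 0, 0), (0, 1, 0))
tri_normals = ((0, 0, 1), (0, 0, 1), (0, 0, 1))

u, v, w = closest_point_barycentric((0.25, 0.25, 1.0), *tri)
print(lerp3(*tri_normals, u, v, w))   # the normal the instance inherits
```

this is what xyzdist + primuv does for you in one call, and what the proximity + transfer node pair has to reconstruct in blender.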

Anonymous No. 921029

>>920875
In old Blender (say, 2.79) we'd do this with the Data Transfer modifier, which lets you copy attributes (like the interpolated face normal, which is what you're after) from one mesh to another based on the nearest faces.

Haven't tried geo nodes yet, the equivalent seems to be this:
https://docs.blender.org/manual/en/latest/modeling/geometry_nodes/attribute/transfer_attribute.html

Do post if you solve this, eh!

Anonymous No. 921038

>>921029
...this video seems relevant but I'm not gonna do your homework for you. Check the first part, "basic proximity transfer":
https://www.youtube.com/watch?v=vbnR97sqRB8

[Image: Normal Test 03.png, 1871x874]

Anonymous No. 921057

>>920886
>do you want this to happen?
That appears to be the case. Because I have the rotation from [Distribute Points on Faces], plugged into the rotation of [Instance on Points]. Without that connection, they're all oriented to the world's Z. Meaning they're all straight up. But with the connection, they're oriented to the Z of the normals like you say. I just did that because it looks cooler. The orientation of the instance isn't that important to me right now. I can play with that later.

>i don't know how attribute inheritance works for instanced copies. look into this
Fugg. Where do I even begin looking for something like that? Realizing instances seems to be the best idea. But what do I do after realize instances? They're their own individual geometry now. But I don't see how to tell the normals of the torus to extend onto the instances.

>and from that point transfer the normal to your instance geo (transfer attribute)
The problem I'm running into is that there doesn't appear to be any node to receive the normals. I'm like 90% sure I can grab them, and use them to do pointless shit like orient the instances. But how do I tell Blender that I want the instances to inherit the normals?

I feel like such an idiot. Been staring at nodes for the last two days, and still I have nothing.

>>921029
I'm going to try data transfer next.

>>921038
I found this video (or video series, rather) a while back. I didn't watch the whole thing, because the guy goes full autismal and explains every step in excruciating detail. But I did skim through the videos. And from what I gathered, his technique is limited because he has to create separate planes of basic topology, and project that onto the complex topology like a mask.... Or something like that. I might go digging through that video again. Because I'm slightly less retarded about nodes today than I was before. So I might actually learn something.

[Image: file.png, 1424x735]

Anonymous No. 921066

>>921057
I'm going to start this post by reiterating that i do not use blender

blender allows you to write out arbitrary data to your geometry using the "store named attribute" node. when attributes are read in via attr transfer they magically become something called "fields" evidently; someone at the blender foundation thought that stealing terminology from C4D would help - it does not.

i do not use blender, but i would figure out where on the geometry this data is stored (points, vertices, faces etc) and how that affects things down the line

blender however, does not allow you to overwrite normals using geometry nodes. this seems arbitrary. why was this choice made? perhaps it's because blender is bad software; but since i don't use it, i cannot say one way or other.

the method in pic-related writes the normals from the torus onto a new attribute "normalN". you are then expected to read this attribute into the shading context and do something with it there. this is needlessly complicated.

in summation, refer to the two hour long videos to which you have referred previously. they should be 5 minutes long, but such is life and the choices you have made. if only you had chosen a different software package.

๐Ÿ—‘๏ธ Anonymous No. 921067

>>921066
also, i will mention that reading in newN as normal for shading doesn't work because the original torus doesn't have a newN in the attached graph. it should be very simple to duplicate the normal attribute and call it newN to fix this problem, but then it should be simpler to do this whole thing in the first place.

Anonymous No. 921068

>>921066
also, i will mention that in the attached graph, reading in normalN as normal for shading doesn't work correctly because the original torus doesn't have a normalN, so all its normals are just (0,0,0).
it should be very simple to duplicate the normal attribute and call it normalN to fix this problem, but then it should be simpler to do this whole thing in the first place.

Anonymous No. 921071

>>921066
>>921068
What do you use? Houdini? I would like to see an example of what I'm attempting to do, done in houdini. I'm seriously considering making the jump. Because I may be new to all this 3D stuff. But I have a feeling that transferring the normals to some instances should be possible. And it's looking impossible on Blender. I REALLY want this effect. I can't imagine working without it.

[Image: file.png, 1284x770]

Anonymous No. 921073

>>921071
1. i don't know exactly what it is you're trying to do
2. in pic related i've transferred the point normals from the surface of the torus onto each of the points on the boxes being instanced, as evidenced by the blue vector visualiser showing which way the normals are pointed
3. it is possible to do what you want in blender, in fact, the example graph i've posted does have normals transferred over, they're just stored in an arbitrary attribute that has to be read in at a later time
4. if ^ is difficult for you, you're not smart enough for houdini

[Image: Normal Test 05.webm, 1280x720]

Anonymous No. 921097

>>921073
1. I believe you grasp exactly what I'm doing. Now let's see that image rendered. Make the boxes and the torus the same color. They should blend in a cool way: when the boxes overlap the surface of the torus, they lose definition, but when they're more to the side, they poke out.

2. Your image makes a kind of sense to me. Except I don't know what null1 is doing. It's more or less what Blender does, except the normals actually transfer to a receptive node, whereas in Blender they don't.

Hang on. You son of a bitch. You actually did it. I'm not deleting what I previously wrote. Because imagine my surprise after I wrote all that, and then I put in your nodes and they actually fucking worked. I'm about to cum. Look at the results! This is what I'm going for. Look how cartoonily fuzzy that is. That's the shit right there.
What I did was pull up the attribute in the material and then plug it into the normal of the Principled shader. That's what transformed it. I got the idea from this video: https://www.youtube.com/watch?v=vEKVzszemxo and also from your Houdini image, because you're putting the normals into a color. So I figured I needed to make the attribute go into the shader.

But unfortunately, there was still a problem: the vectors of the instances weren't moving properly with the underlying torus. After some fiddling, I actually managed to solve this issue on my own, by placing a Vector Rotate node between Transfer Attribute and Store Named Attribute. And now the vectors track properly!

This is basically what I was looking for. I'm sure there might be some tighter more efficient way to do it. I'm sure there are all kinds of extra features I can tack on top of this. But this is the basic functioning idea. Thanks for pretty much handing me the solution.
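the rotate fix makes sense if you think of the transferred normals as sampled in the torus's rest orientation: once the object rotates, the stored vectors have to be rotated by the same amount or the shading lags behind. a toy sketch (single-axis rotation, made-up vectors):

```python
import math

def rotate_z(v, angle):
    """Rotate vector v about the world Z axis by `angle` radians."""
    c, s = math.cos(angle), math.sin(angle)
    x, y, z = v
    return (c * x - s * y, s * x + c * y, z)

# A normal stored while the torus was unrotated...
n_local = (1.0, 0.0, 0.0)
# ...must be re-rotated when the torus turns 90 degrees,
# otherwise the shader keeps lighting the rest pose.
n_world = rotate_z(n_local, math.pi / 2)
```

in the node graph the same thing happens per-point, driven by the object's actual rotation rather than a fixed angle.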

[Image: Normal Test 06.png, 2175x915]

Anonymous No. 921098

>>921073
And secondly, here's the node set up. On the right is the shader node for the cone I used for instancing.

Anonymous No. 921101

>>921097
>>921098
congratulations. i hope the lessons you take away from this experience are:
1. data is just data, it doesn't matter what label you put on it, as long as you plug it in to the right place. you can bend it however you need to along the way as long as you know what you're doing.

this is why i take offence with blender treating normals like some special attribute that should not be touched by geo node users. it's just a vector. you should be able to set it directly on the geometry.

2. i don't use blender. but if i did, i would use the tutorials from entagma and junichiro horikawa to familiarise myself with geo nodes, because they are both houdini pros and explain things sensibly.

gl hf

Anonymous No. 921104

>>921097
>>921098
Thanks for posting your solution.

A question: if you add an Armature and/or shape keys to the torus, does the transfer still work when the torus is deformed? Right now you seem to be only using the object transform.

[Image: Normal Test 08.webm, 1280x720]

Anonymous No. 921111

>>921104
I have no animation experience. So I just moved some bones around randomly. Seems to work.

Anonymous No. 921115

>>921111
>>921104
Also, it's rather expensive. Without the instances, the torus is only 1,152 triangles; with the instances, it's 32,340 triangles. And the Blender manual warns that realizing instances makes things much more costly.
So you should be thinking of ways to slim this down. I don't really plan on making whole surfaces covered in fuzz like this torus, but rather patches here and there that stick out stylistically. I bet you could reduce the cost further by only using 1 or 2 triangles per instance, with a texture or something. I don't know.
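the savings are easy to ballpark: total cost is base triangles plus instance count times per-instance triangles. the per-spike count below is hypothetical (the post only gives the two totals), but the arithmetic shows why thin cards help:

```python
base_tris = 1_152        # torus alone (from the post)
total_tris = 32_340      # torus + realized instances (from the post)
instance_tris = total_tris - base_tris  # triangles the fuzz adds: 31,188

# Hypothetical: if each spike costs 12 triangles, swapping it for a
# 2-triangle textured card cuts the added cost by a factor of 6.
per_spike = 12
n_spikes = instance_tris // per_spike
card_total = base_tris + n_spikes * 2
print(instance_tris, n_spikes, card_total)
```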

[Image: furstuff.png, 2001x1635]

Anonymous No. 921175

>>921098
Setting Store Named Attribute from Edge to Point creates a smoother effect. Also, experiment with the materials; you can make some cool effects if you fiddle around enough.
Not sure if this info is useful.

[Image: Normal Test 09.png, 520x349]

Anonymous No. 921274

>>921175
You're right, using Point in Store Named Attribute is smoother. But I didn't have that weird jagged effect you did in the left image. Mine just toggled between two kinds of smooth, "Point" being almost imperceptibly smoother than "Edge".

It's like you read my mind on the toon shader. I was just thinking about setting up a toon shader for this. And I have one already that's kind of limited. So I was really thinking of reconfiguring it to be more capable of different lighting. I like the rim light. It's a nice touch. But maybe more can be done.

I copied your setup, except I couldn't figure out what [Power] was accomplishing, so I left it out. You'll have to explain that bit to me. Anyway, I tacked on more nodes to get more shading steps in. So now the light has two parts, and the shadows have two parts. Half of the shadow is dedicated to a rim light, so no matter where you look, a bit of shading will contour the shadowed areas. This will help give shape to shadowed parts of the object.

And then I added this sort of half step on the light side. And the reason I did that is that when you're taught to paint, they tell you to make a more saturated version of the color to transition into shadows. That's what the light side is doing: it's transitioning from light to a saturated version of the light, then being overtaken by the shadow. I'll draw it out in another image.
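the banding described here is just a step function over the diffuse term, like a ColorRamp set to Constant. a sketch with made-up thresholds and band names (the real rim band keys off the facing term rather than N·L — collapsed here for brevity):

```python
def toon_ramp(d):
    """Map a diffuse term d = max(0, N.L) in [0, 1] to a shading band.

    Bands mirror the setup described above: a rim band inside the shadow,
    the core shadow, a saturated transition, then the lit color.
    Thresholds are made up for illustration.
    """
    if d < 0.15:
        return "rim"        # rim light contouring the darkest areas
    if d < 0.45:
        return "shadow"     # core shadow
    if d < 0.60:
        return "saturated"  # saturated transition into light
    return "light"          # fully lit
```

adding more bands is just adding more thresholds, which is exactly what tacking on extra ColorRamp stops does.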

[Image: Normal Test 11.png, 1720x759]

Anonymous No. 921276

>>921274
>>921175
And here's what it all looks like. I should mention that I did it inside of a group, because I wanted it to be something I can quickly plug into any material and then just change the colors.

Also, I'm wanting to be able to blend all of these colors with environmental lighting. So they actually change color depending on the light source.

[Image: Normal Test 12.png, 1280x720]

Anonymous No. 921284

>>921175
>>921274
>>921276
This is a better render than before. I added more nodes to create a faux-highlight. Using a Layer Weight node yet again to make it track the camera. But this time I used the "facing" option. I don't know why, but facing works better than fresnel. It actually blends evenly, and I can tune it without it going bonkers. So I changed all the Layer Weight nodes to facing. And played with the ColorRamp nodes some more to get a more distinct shadow. All of the shading should be a little more distinct now. This still isn't ideal. But it's the concept of what I'm looking for.
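as for why Facing tunes more evenly than Fresnel: facing is (roughly) linear in the view angle, while fresnel stays flat and then spikes near grazing — the "going bonkers" part. a rough sketch of both curves; Schlick's approximation stands in for the Fresnel output, and the exact Blender formulas differ:

```python
import math

def facing(cos_nv):
    """Roughly the Layer Weight 'Facing' output: 0 head-on, 1 at grazing."""
    return 1.0 - abs(cos_nv)

def fresnel_schlick(cos_nv, f0=0.04):
    """Schlick's approximation: hugs f0, then spikes sharply near grazing."""
    return f0 + (1.0 - f0) * (1.0 - abs(cos_nv)) ** 5

# Compare the two as the surface turns away from the camera.
for cos_nv in (1.0, 0.7, 0.3, 0.05):
    print(round(facing(cos_nv), 3), round(fresnel_schlick(cos_nv), 3))
```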

[Image: colorsep.png, 1287x919]

Anonymous No. 921285

>>921274
the power was used to fine-tune the shadow; it only works well with Constant. I forgot to delete it in the screenshot
>>921276
Not sure if there is a way to change the color of the object based on the color of the light when using a toon shader. You could have a MixRGB with an input for adding color/factor at the end of the material (idk, I could just be retarded)

Fun tip: you can separate the shader output with Separate RGB and use red, green, and blue for 3 separate lights. For example, the red light is the base light, the green light can be the rim light, and blue can be a fill light for the shadows. This method is kinda clunky, but if you want more control over the rim light effect and fill color it could work
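the trick amounts to using the R/G/B channels as three light masks and recoloring each afterwards. a sketch with made-up colors (per-channel values in [0, 1]):

```python
def recombine(pixel, base, rim, fill):
    """pixel = (r, g, b) masks from the red, green, and blue lights.

    Each channel controls how much of its assigned color shows up,
    giving independent control over base light, rim, and shadow fill.
    """
    r, g, b = pixel
    return tuple(r * bc + g * rc + b * fc
                 for bc, rc, fc in zip(base, rim, fill))

# Red light fully lit, a touch of green rim, no blue fill (made-up values):
print(recombine((1.0, 0.2, 0.0),
                base=(0.8, 0.4, 0.2),   # warm base color
                rim=(1.0, 1.0, 1.0),    # white rim
                fill=(0.1, 0.1, 0.3)))  # cool shadow fill
```

the clunky part is that your three lights have to stay pure red, green, and blue for the masks to stay separable.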

Anonymous No. 921287

>>921285
>Fun tip you can separate the RGB from the shader to RGB and use Red Blue and Green for 3 separate lights
I vaguely remember seeing that in a tutorial some months back. I completely forget the contents of the tutorial though. I didn't follow along, I just watched it out of curiosity to see what big complicated things people are making.

My intuition is telling me that there's a solution to making toon shading dynamic by separating RGB in nodes. Maybe I saw it in a tutorial somewhere and forgot it. I'm sure there's a way though. I feel it in my gut. We will just have to figure it out.

Anonymous No. 923783

Bumping thread, because I'm hoping someone comes up with a better toon shader.