🧵 Untitled Thread
Anonymous at Fri, 10 Nov 2023 16:58:42 UTC No. 963839
it's OFFICIALLY over...
for the last time.
it's not even a meme at this point.
im out, /3/.
see you, space cowboy...
Anonymous at Fri, 10 Nov 2023 17:56:40 UTC No. 963844
>>963839
It's all Disney/Pixar style stuff.
Anonymous at Fri, 10 Nov 2023 18:26:22 UTC No. 963850
>>963839
/g/ demoralization thread
Anonymous at Fri, 10 Nov 2023 18:38:41 UTC No. 963853
>>963839
ok now show a shot longer than 2 seconds with more than 1 character
Anonymous at Fri, 10 Nov 2023 19:21:15 UTC No. 963855
>>963844
That's basically 90% of what the market wants.
Anonymous at Fri, 10 Nov 2023 20:34:37 UTC No. 963865
>>963839
why does everyone have a worried expression on their face?
Anonymous at Fri, 10 Nov 2023 21:01:47 UTC No. 963868
>>963839
I think it will be used to generate inbetweens mostly instead of wasting a week rendering. Otherwise you can't control what it generates. For fx too I guess.
Anonymous at Fri, 10 Nov 2023 21:44:36 UTC No. 963875
>>963839
> my self esteem comes from the value of my craft
> machine devalues my craft
> I am worthless
> I kms
Anonymous at Fri, 10 Nov 2023 23:47:37 UTC No. 963880
>>963868
this sounds like a legit use case. instead of rendering a 24/30 fps movie, you can just do it at half rate and let the AI fill in the blanks.
edit: i just realized that's literally just DLSS3 or SVP video.
Anonymous at Sat, 11 Nov 2023 00:58:19 UTC No. 963886
>>963839
It all has that soulless look
Anonymous at Sun, 12 Nov 2023 05:48:21 UTC No. 963966
>>963868
finally someone fucking gets it
inbetweens using AI are going to be huge
Anonymous at Sun, 12 Nov 2023 07:29:18 UTC No. 963968
there is no life in this "animation"
but yes, where it can actually be used is inbetweens, to really reduce render times
Anonymous at Sun, 12 Nov 2023 07:52:48 UTC No. 963970
>>963968
Not really worth it considering the same TPUs can be used for denoising.
I'd expect ray reconstruction to be available in production renderers in a few months.
Depends on geometry processing costs and whether those are a bottleneck ofc.
Framerate interpolation kind of works for doubling but anything more will produce artifacts.
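The doubling point above is easiest to see with the naive, non-AI version of frame interpolation: just blend each pair of adjacent frames. A toy sketch (assuming NumPy; `double_framerate` is a hypothetical helper, not any real renderer's API) — actual interpolators like DLSS 3, SVP, or RIFE estimate motion and warp pixels instead of averaging them, which is why a single doubling holds up and larger factors smear:

```python
import numpy as np

def double_framerate(frames: list[np.ndarray]) -> list[np.ndarray]:
    """Insert one blended inbetween between each pair of rendered frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        # Naive inbetween: average the two frames. Fast motion ghosts here,
        # which is where motion-aware (AI) interpolation earns its keep.
        mid = (a.astype(np.float32) + b.astype(np.float32)) / 2
        out.append(mid.astype(a.dtype))
    out.append(frames[-1])
    return out

# n rendered frames become 2n - 1, so rendering at half rate and
# interpolating roughly recovers the target framerate.
src = [np.full((2, 2, 3), v, dtype=np.uint8) for v in (0, 100, 200)]
doubled = double_framerate(src)
```

Going beyond 2x with this scheme just stacks more blended copies between the same two real frames, so artifacts compound instead of new information appearing.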
Anonymous at Sun, 12 Nov 2023 11:30:09 UTC No. 963974
>>963839
>same basedface in every shot
Anonymous at Mon, 13 Nov 2023 12:58:22 UTC No. 964041
>>963839
Holy fuck, this looks so good
>inb4 ewww, no it looks baad, muh sovl
Look at how text2video looked a few months ago, then a year ago, then- oh wait, text2video wasn't even a thing 2 years ago, yet we already have consistent style and coherent movement generation. You can get a 3D raytraced scene without doing any 3D modeling or raytracing
>but it's only 2 seconds
It's pointless to go for longer generation length when it still doesn't look perfect, plus I assume that longer generation is still way harder than getting good image quality
Anonymous at Mon, 13 Nov 2023 15:48:03 UTC No. 964054
>blurring the hand under the orb
>blue alien has 6 fingers on one hand, 3 on the other
Anonymous at Mon, 13 Nov 2023 20:46:28 UTC No. 964070
>>964041
>Raytraced
Isn't raytracing all about making the lighting look as realistic as possible? If that's the case, then why would you pick AI that can't even fully figure out that hands have 5 fingers most of the time? Yes, it is still a problem even in DALL-E 3, just sometimes. If it cannot even put a red box on a blue box without shitting itself, then how is it going to put shadows in the correct places? This is dumb. Just have AI generate the whole 3D scene, render it at 5 fps, and then let AI do the inbetweens better than current DLSS3 or SVP video, like >>963880 mentioned.
Anonymous at Wed, 15 Nov 2023 01:28:02 UTC No. 964257
I mean it's cool, but it's so schizophrenic when you look at it for more than a second