
Unreal Engine 4 GDC feature techdemo screengrabs, unveil June [Up: New, Better Shots]

BigTnaples

Todd Howard's Secret GAF Account
Because games always look better than tech demos. WTF?


Sometimes yes, sometimes no.


This is a tech demo meant to show off very specific effects, this is not a game in progress like Gears was or likely Samaritan.


People need to see something they are familiar with upgraded to UE4 on next gen platforms.

Like I said, if Gears, Mass Effect, Batman: Arkham, and dozens of other UE3 current-gen heavy hitters don't blow gamers away running on next-gen hardware on UE4, I will eat my shoe.

The features shown off in this tech demo are impressive, but the way they are shown is hard to appreciate.

Bookmark this for when E3 2013 comes around...*

(*again providing that MS and Sony release competent hardware)
 
Epic is posturing.

I have a hard time believing it would spend the time and money creating UE4 without knowing it would work on next generation consoles.
 

i-Lo

Member
More like:

Polycount - no (you can clearly see it on characters in the first in-engine trailer; besides, we're talking CGI vs. polycounts optimized for real-time animation), but it's perfectly fine
Animation - not even close
Lighting - pretty much achieved, maybe even better
Particle effects - not even close

I called it like I saw it, to be honest, regardless of how limited the underlying processes are in real time. It's the same with this UE4 showcase. If the end results, whether through raw power or clever illusion, are similar, then I'd assume the general consumer would care to the same extent we do.

This is the reason why UE4 has quite a lot to prove.
 

Blizzard

Banned
Tessellation is what allows for better performance. All it is is a smoother, more reliable and more flexible version of the LOD system we use today, with the added benefit of being able to render actual geometry instead of normals.

Also, we're probably going to be looking at a dedicated hardware tessellator in the next consoles, so it won't affect performance either way.

Developers either need to give every character a pair of goggles, a visor, or glasses, or they need to stop using lens flare. I don't get why it's become so popular lately; it doesn't happen in real life unless you're constantly looking through a lens.
Problem is, tessellation is a performance killer currently (e.g. the silly huge number of polygons in the DX11 Crysis concrete barrier people were making fun of). It'd be nice if it made performance better, just like it'd be nice if DX11 performed better than DX9 in games, but in practice, Metro 2033 and possibly Project CARS are the only games I recall actually behaving in that fashion.
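A rough way to see why blanket tessellation is so expensive, as in the Crysis barrier example above: output triangle count grows roughly with the square of the tessellation factor, so a flat factor applied everywhere multiplies geometry enormously. A toy sketch (the mesh size and factors are made-up illustration numbers, not from any real engine):

```python
# Toy illustration: uniform tessellation multiplies triangle counts fast.
# A patch tessellated with factor f produces roughly f^2 triangles per
# input triangle, which is why applying tessellation only where it's
# visible matters so much for performance.

def tessellated_triangles(base_triangles: int, factor: int) -> int:
    """Approximate output triangle count for a uniform tessellation factor."""
    return base_triangles * factor * factor

barrier = 1_000  # a modest source mesh
print(tessellated_triangles(barrier, 1))   # 1000 - no tessellation
print(tessellated_triangles(barrier, 8))   # 64000
print(tessellated_triangles(barrier, 16))  # 256000 - GPU time spent on a flat slab
```

The quadratic growth is the whole story: a 16x factor on everything means 256x the geometry, most of it invisible detail on flat surfaces.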
 

i-Lo

Member
Problem is, tessellation is a performance killer currently (e.g. the silly huge number of polygons in the DX11 Crysis concrete barrier people were making fun of). It'd be nice if it made performance better, just like it'd be nice if DX11 performed better than DX9 in games, but in practice, Metro 2033 and possibly Project CARS are the only games I recall actually behaving in that fashion.

As tessellation becomes more mainstream in game development, programmers will find efficient and creative ways of utilizing it that actually lend benefits to the game. This is another aspect of UE4 I am curious about.
 

Stallion Free

Cock Encumbered
Problem is, tessellation is a performance killer currently (e.g. the silly huge number of polygons in the DX11 Crysis concrete barrier people were making fun of). It'd be nice if it made performance better, just like it'd be nice if DX11 performed better than DX9 in games, but in practice, Metro 2033 and possibly Project CARS are the only games I recall actually behaving in that fashion.

Tessellation performance was fine in the most recent AvP, since they chose wisely where to use it.
 

Mr Swine

Banned
As tessellation becomes more mainstream in game development, programmers will find efficient and creative ways of utilizing it that actually lend benefits to the game. This is another aspect of UE4 I am curious about.

But we know this won't happen; most developers will use tessellation to add more geometry to character models, weapons, environments and more, thus making the games run worse than they should :p
 

roninjedi

Neo Member
You've got to remember this is from a developer's standpoint.

Plus, the engine is geared toward easier development, so the hardware does more of the work rather than software and cheap tricks.

With today's games using UE3, you have to bake lighting and trick the user into thinking it's beautiful, which takes a lot of time, especially in a pipeline. So adding a tree, house or character to a rejected game proposal becomes a rigid ordeal.

But if the article is right, with higher-end hardware and UE4 a developer can cut out the tricks and just display what they want.

*Some tricks, like level of detail, could be cut out, freeing the artist from making 2-6 different models of the same character, which saves time.
 

Stallion Free

Cock Encumbered
But we know this won't happen, most developers will use Tesselation to add more geometry to character models, weapons, environments and more thus making the games run worse than they should :p

Actually, character models and weapons are great uses for it, as those are things the player is guaranteed to pay attention to, and they are much easier to control for performance.
 

i-Lo

Member
Actually, character models and weapons are great uses for it, as those are things the player is guaranteed to pay attention to, and they are much easier to control for performance.

Precisely. More importantly, I am hoping for tessellation aiding transitions between LODs of models as the viewer gets closer to them. Today we can still see jarring changes (almost like pop-in) in character models in quite a few games, perhaps because they have 2 to 3 different LOD models for a given character that cycle depending on how close you are to them.

Also, I'd like to see pop-in of geometry mitigated even further (better draw distance allied with tessellation).
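The pop-in described above comes from swapping between a handful of pre-built LOD meshes at hard distance cutoffs, while a distance-driven tessellation factor replaces those jumps with a continuous ramp. A hypothetical sketch of the difference (all distances, thresholds and factors are invented for illustration):

```python
# Hypothetical comparison: discrete LOD selection (hard switches that
# cause visible pops) versus a continuous distance-based tessellation
# factor. All numeric values are made up for illustration.

def discrete_lod(distance: float) -> int:
    """Classic LOD: pick one of three pre-built models by distance band."""
    if distance < 10.0:
        return 2   # highest-detail model
    if distance < 30.0:
        return 1
    return 0       # lowest-detail model

def smooth_tess_factor(distance: float, max_factor: float = 16.0,
                       full_detail: float = 10.0, min_detail: float = 60.0) -> float:
    """Continuous factor: ramps from max_factor down to 1.0 with distance."""
    if distance <= full_detail:
        return max_factor
    if distance >= min_detail:
        return 1.0
    t = (distance - full_detail) / (min_detail - full_detail)
    return max_factor + t * (1.0 - max_factor)  # linear interpolation

# Crossing 30 m flips the discrete model instantly (the visible "pop"),
# while the tessellation factor barely changes across the same boundary:
print(discrete_lod(29.9), discrete_lod(30.1))  # 1 0
print(round(smooth_tess_factor(29.9), 2))      # 10.03
print(round(smooth_tess_factor(30.1), 2))      # 9.97
```

Real tessellation shaders evaluate something like `smooth_tess_factor` per patch on the GPU, so detail scales smoothly instead of in two or three visible steps.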
 

Natty1

Member
Actually character models ad weapons are great uses for it as those are things that the player is guaranteed to pay attention to and they are much easier to control for performance.

This has been mentioned in this thread, but I think LOD sounds like the best use. I thought this would save development time, since they wouldn't have to make multiple models to swap out, and it would provide a smoother (almost unnoticeable?) transition, as it can apparently take out geometry dynamically.
I'm not going to pretend I'm an expert on the subject, though, and am just going by this video and various things I've read:
http://www.youtube.com/watch?v=sQQpCd_vvGU

Edit: Guess I'm echoing the above post, which was posted while I was re-watching the video.
 
Problem is, tessellation is a performance killer currently (e.g. the silly huge number of polygons in the DX11 Crysis concrete barrier people were making fun of). It'd be nice if it made performance better, just like it'd be nice if DX11 performed better than DX9 in games, but in practice, Metro 2033 and possibly Project CARS are the only games I recall actually behaving in that fashion.

Depends on the game; Halo Wars, for example, makes good use of tessellation.
 

BigTnaples

Todd Howard's Secret GAF Account
Samaritan is a game?

I said likely; we don't know for sure, but Samaritan seemed more realized art-wise than the average tech demo.

Case in point, look at this tech demo, obviously not going to be a game..

However the original demo for UE3 turned out to be Gears of War.

Tech demos using game assets will, nine times out of ten, be more impressive to the gamer than a tech demo with assets made for the sole purpose of a tech demo, which is what we are seeing here.

If you think that Rocksteady won't turn this

[Batman: Arkham City screenshots]

into something absolutely breathtaking with UE4 and proper next-gen hardware, then I don't know what to tell you.


Hell even if they did a straight UE4 port and added in all the new effects with the same assets it would look out of this world. Let alone being built from the ground up.

Same with Gears. Epic is going to blow people away with UE4 and the Gears and, hopefully, Unreal franchises.


Just take a look at any of the current-gen UE3 heavy hitters and tell me they would not look amazing with new lighting effects, bokeh DoF, reflections, tessellation, and volumetric effects, plus the additional next-gen standards like increased poly count, texture resolution and scale.
 

Stallion Free

Cock Encumbered
Just take a look at any of the current gen UE3 heavy hitters and tell me they would not look amazing with new lighting effects, bokeh DoF, Reflections, with the additional next gen standards like increased poly count and scale.

I've always had this fantasy where Epic releases the Gears trilogy under a single .exe on PC on the latest revision of the Unreal Engine. Gears 1 would be an insane improvement, relit with dynamic lighting. Combine that with tessellated characters and PC-tier image quality (1440p + proper AA + 60 fps) and I would buy the hell out of it.
 

WrikaWrek

Banned
As a game developer, what I'm looking forward to most is UE4's solution to real time osmorphic dithering. Lots of cool stuff, tho. The new 'geo-phasing' tool, which promises to unite autocorrelative aspect rending with lockdown framerate buffering has me extremely excited. Real-time terraforming and pre-phased shaders (seen on the mountain shot) should help with open world games. Plus I know they're working on a sub module for synchronous bipedal modelling which is going to revolutionize character wireframes. I'm optimistic.

[reaction image: "bitch, please"]
 

dark10x

Digital Foundry pixel pusher
I've always had this fantasy where Epic releases the Gears trilogy under a single .exe on PC on the latest revision of the Unreal Engine. Gears 1 would be an insane improvement, relit with dynamic lighting. Combine that with tessellated characters and PC-tier image quality (1440p + proper AA + 60 fps) and I would buy the hell out of it.

That would be awesome.

Though, I've never been able to pull off most UE3 games on my PC at 1440p with SGSSAA. I always lose 60 fps. :\

What's up with that?
 

peter

Banned
The screens have maybe some details that "could be" next gen, but the UE3 trailer was way better than this. Also, I'm sure the Wii U could run this with a few changes and nobody would notice.

Anyway, we can argue about the small details, but one thing is sure:
the graphics don't come even close to the movie Avatar. Avatar looks ten times better than this.
 

Stallion Free

Cock Encumbered
Though, I've never been able to pull off most UE3 games on my PC at 1440p with SGSSA. I always lose 60 fps. :\

Lol, you realize that 1440p + SGSSAA essentially means rendering 1440p out at a multiple of 4 or 8 in terms of performance? 4 GB of GDDR5 on a single card required, minimum. 600-series cards should be able to manage high levels of MSAA at 1440p at 60+ fps in UE3 just fine. UE4 is being designed around a DX11 rendering path, and MSAA is more efficient when implemented by devs through DX11 than the brute-force ATI/Nvidia driver methods used with deferred-rendering DX9 engines.
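The arithmetic behind that multiple, for anyone curious: sparse-grid SSAA (unlike plain MSAA) shades every sample, so 4x or 8x SGSSAA at 1440p means shading 4 or 8 frames' worth of pixels. A quick back-of-envelope sketch:

```python
# Back-of-envelope shading cost for 2560x1440 with sparse-grid SSAA,
# which (unlike plain MSAA) runs the pixel shader for every sample.

width, height = 2560, 1440
pixels = width * height
print(f"{pixels:,} pixels per frame")  # 3,686,400

for samples in (1, 4, 8):
    shaded = pixels * samples
    print(f"{samples}x SGSSAA -> {shaded:,} shaded samples "
          f"(~{shaded / (1920 * 1080):.1f}x a 1080p frame)")
```

At 8x that is nearly 30 million shaded samples per frame, around 14 times the work of a plain 1080p frame, which is why even strong cards lose 60 fps.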
 
It doesn't look 'real', but it looks real enough to be something like a miniature. The mountain range looks gorgeous, but I can't unsee it as a miniature, a downscaled physical model, kind of like what they use in film.
As I said earlier, the depth of field is too strong. If the focus point is beyond, say, 10 meters, nothing should be blurred ("hyperfocal distance").
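For reference, the hyperfocal distance invoked above is H = f²/(N·c) + f: focus at or beyond H and everything out to infinity stays acceptably sharp, which is why heavy blur on a distant mountain reads as a miniature (the tilt-shift effect). A quick check with typical full-frame camera numbers, chosen purely for illustration:

```python
# Hyperfocal distance: H = f^2 / (N * c) + f, where f is focal length,
# N the f-number, and c the circle of confusion (~0.03 mm is the usual
# full-frame value). Example numbers are illustrative.

def hyperfocal_mm(focal_mm: float, f_number: float, coc_mm: float = 0.03) -> float:
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

# A 35mm lens at f/8 on full frame:
h = hyperfocal_mm(35, 8)
print(f"hyperfocal distance ~ {h / 1000:.1f} m")  # ~5.1 m
```

So with a moderate wide-angle lens stopped down, anything past a few meters is sharp; a renderer that blurs a mountain kilometers away is simulating a much longer lens or a much closer focal plane than the scene implies.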
 

dark10x

Digital Foundry pixel pusher
Lol, you realize that 1440p + SGSSAA essentially means rendering 1440p out at a multiple of 4 or 8 in terms of performance? 4 GB of GDDR5 on a single card required, minimum. 600-series cards should be able to manage high levels of MSAA at 1440p at 60+ fps in UE3 just fine. UE4 is being designed around a DX11 rendering path, and MSAA is more efficient when implemented by devs through DX11 than the brute-force ATI/Nvidia driver methods used with deferred-rendering DX9 engines.
Of course, but even MSAA doesn't run perfectly.

I'm pretty much stuck using FXAA with UE3 games.
 

RiverBed

Banned
Remember that tech companies show off the tech, but they aren't the best at showing how good a game can look. Compare the first UE3 tech demos to the amazing-looking games made on it.
 

mdtauk

Member
Remember that tech companies show off the tech, but they aren't the best at showing how good a game can look. Compare the first UE3 tech demos to the amazing-looking games made on it.

Epic, however, do make games themselves. But these all look the same ¬_¬
 

i-Lo

Member
Remember that tech companies show off the tech, but they aren't the best at showing how good a game can look. Compare the first UE3 tech demos to the amazing-looking games made on it.

That makes me wonder, what are the best looking (art wise) games that utilize UE3?
 
That makes me wonder, what are the best looking (art wise) games that utilize UE3?


There are only a few games using UE3 that impress me at all. Most have that same plain plastic look to them. ME2 and 3 manage to not look like UE3 games. And the Batman games impress me with their scale, Batman: AA with its combination of scale and quick loading. But to me the most impressive game using UE3 is Tera. I can almost always pick out a UE3 game on first sight, but I never would have guessed that Tera uses that engine.
 

BigTnaples

Todd Howard's Secret GAF Account
That makes me wonder, what are the best looking (art wise) games that utilize UE3?

There have been so many.

BioShock
BioShock Infinite
Borderlands 1-2
Batman: AA and AC
Gears 3
Mass Effect
Mirror's Edge
Alice
Bulletstorm
Dishonored
Enslaved
Mortal Kombat
Aliens: Colonial Marines
BiA: Hell's Highway

etc.

All have pretty awesome artwork.
 
As a game developer, what I'm looking forward to most is UE4's solution to real time osmorphic dithering. Lots of cool stuff, tho. The new 'geo-phasing' tool, which promises to unite autocorrelative aspect rending with lockdown framerate buffering has me extremely excited. Real-time terraforming and pre-phased shaders (seen on the mountain shot) should help with open world games. Plus I know they're working on a sub module for synchronous bipedal modelling which is going to revolutionize character wireframes. I'm optimistic.
I feel like I'm reading a Star Trek fanfic.

Anyway, I think people are way too critical of a tech demo made for developers. I also find it kind of funny that these shots remind me of the original Heavenly Sword tech demo, which people thought would be the baseline for this gen, lol.
 
Funny thing is, there are people who, I kid you not, think that every light is going to have multiple bounces, all in real time. o_O I just don't see this happening. It blows my mind as I watch this company's rendering farm slowly do raytracing.

Maybe a few lights doing something like 2 bounces, but I don't see Epic literally leapfrogging Crytek's solution of 2 light bounces from 1 main light source.
 
As a game developer, what I'm looking forward to most is UE4's solution to real time osmorphic dithering. Lots of cool stuff, tho. The new 'geo-phasing' tool, which promises to unite autocorrelative aspect rending with lockdown framerate buffering has me extremely excited. Real-time terraforming and pre-phased shaders (seen on the mountain shot) should help with open world games. Plus I know they're working on a sub module for synchronous bipedal modelling which is going to revolutionize character wireframes. I'm optimistic.

What do you mean by synchronous?
 

scitek

Member
Why would they care about impressing the internet? Everyone who matters to them was already at the initial reveal of the engine or had a private showing. You know, the people who pay to license it.

Exactly. People need to understand this wasn't meant to impress the general consumer. The big deal here is that it was apparently all being done in real time right from the editor, and things like the lighting no longer needing to "bake" (similar to CE3) are also huge improvements.
 

i-Lo

Member
I don't even...


He did but you have to remember that he is selling a product.

Mark Rein isn't a pure PR guy. His background includes software engineering. So I am not going to count that as a mitigating circumstance.

This is also the reason why I am not judging until I see two things: 1) The demo in motion and 2) The first batch of final products created by this engine (due sometime in 2014).
 
Mark Reign isn't a pure PR guy. His background includes software engineering. So I am not going to count that as a mitigating circumstance.

This is also the reason why I am not judging until I see two things: 1) The demo in motion and 2) The first batch of final products created by this engine (due sometime in 2014).

It's Rein.
Rein is a smart guy, don't get me wrong, but one of his main jobs is to sell licenses of their engines. He's posturing. You should always take what he says about his engine with a grain of salt.
 
The shots don't wow me, but considering how widely used UE is right now, and that there's a very good chance it still will be when future consoles and graphics hardware show up, the thought of a more flexible engine that helps developers save resources seems nice to me. I could very well be talking bullshit, though; I'm really not educated when it comes to the tech side of things.
 

i-Lo

Member
It's Rein.
Rein is a smart guy, don't get me wrong, but one of his main jobs is to sell licenses of their engines. He's posturing. You should always take what he says about his engine with a grain of salt.

Well, that's that then. This is the reason why I revere people like John Carmack and Tim Sweeney. They say it like it is almost all the time.

I just think it might have been a mistake on Epic's part to release it to the public as screens when they themselves drove expectations through the roof.

However, as aforementioned, we've got to wait and see its performance in motion and what it can do for devs to constrain costs and save time. If UE becomes the one engine that almost all third parties end up using, then it has to deliver on those fronts.
 

Pachinko

Member
The only problem with those screens, and possibly why many are unimpressed, is that, flat out, the closeup of the demon lord's head doesn't look that great. It's low-poly, and in a single screengrab nothing about it looks beyond what we see today in spades. Hell, it looks just like something from the game Overlord to me.

You see "Unreal 4.0" and you want to be absolutely floored, and these screens simply don't do that. Even if you blow them up and look at the extra detail, which IS there, there isn't anything that really wows the shit out of ya. The other half of the demo sounds more likely to do that; playing with a ton of real-time light is very demanding performance-wise, but again, screens of such a thing won't likely do much. You will NEED to see video.

There's also Epic building up their own hype with statements such as "Unreal 4.0 will make Samaritan look like dog shit" or similar hyperbole. Samaritan actually still looks pretty damn amazing; I'd be curious to see Epic make a new version of that demo running in Unreal 4.0 just to better showcase why the new engine is better. Everything about it sounds excellent on paper.

Mentioning the easier changing of assets is a huge plus to hear about. I don't work in the industry, but it's a sticking point for me as a customer, realizing that the way games are made, and have been made for the last 25 years, is an unsustainable method of production.

8-bit games took a half dozen people 6 months and $100K
16-bit games took a dozen people 6 months to a year and $200K
32/64-bit games took 2 dozen people a year+ and $500K
128-bit games took 4 dozen people a year or 2 and $1M or more
256-bit (current gen) games take 80-100 people 18 months to 3 years and $10M or more

The logical next end point for that is troubling:
Next-gen games needing 200+ staff, 30 months or more of dev time, and $50M or more

Anything that can put a wrench into that equation is good in my books. I'm also exaggerating a bit; I'd say it's more accurate to say that the man-hours required to make a game quadruple with each generation. This translates into, say, a 2600 game needing 640 hours of dev time, so 1 guy spends 4 months, or perhaps 2 guys spend 2 months, building it. Then on the NES, using the new resources available, that game's successor takes 2,560 hours to build, so 4 guys have to spend 4 months, or 8 guys spend 2 months. Move that up to 16-bit and now you need 10,240 man-hours to build a game; that 8-man team could do it, but it will take them 8 months, or you can turn it into a 16-man team and build it in 4 months. Dev costs increase as you add staff and take longer to make things.

Get up to the PlayStation era and now you need 40,960 man-hours to make a game. Starting from that 1-guy example, that's 20 years of regular 40-hour weeks. Using the same size team as the SNES era (16 people) you get a much more feasible 64-week dev time, and if you double your staff you can still get a game out the door in roughly 6 months.

It's the leap from the PlayStation to the PlayStation 2 that starts throwing things into perspective: 163,840 man-hours to build a game last generation. 80 years of work for 1 guy. Double the team size you ended up with on PS1 (to 32 people), though, and you get 2.5 years. This means you can cut corners to make a smaller game and release earlier, OR you hire more people again; doubling your team size to 64 means you might get a game finished in 15 months, but you have to pay 64 people for those 15 months. This is why so many developers and publishers collapsed under the weight of failure last generation.

The absurdity involved in going from PlayStation 2 to 360/PS3 is why Epic's engine simplifying and speeding up as many processes as possible is absolutely required.
655,360 man-hours. Just read that aloud. These time estimates are based largely on the visual-fidelity improvements I've witnessed with my own 2 eyes in the last 20 years. Compare Gears of War 3 to Atari Combat and you'll realize that absurd number isn't really so far-fetched. Our poor programmer from the 2600 game would require 315 years to make, say, Gears of War 3. Even utilizing the 64-man team you put together for the PS2 generation ends up just shy of 5 years of development time. This leads to the same decision as last time. Do you keep that team and toil over a game for the entire generation to make a product? That's what's happened in Japan with many developers (Sony and Squaresoft to name the most prominent). Do you double up the team size to 128 staffers to maybe squeeze that game out in 2.5 years? That's what many Western devs and some Japanese devs have done (Capcom, EA, Activision). Or do you take the option of least resistance that everyone else has taken these last few years and scale the product's size back to half that of the previous-gen game and release more sequels? There's also the option that only EA, Take-Two and Activision have taken so far: quadrupling staff, so 250 staffers can toil over these mammoth projects and get them done in 15 months. The cost involved in that method is very prohibitive, though.

So the future of this is bordering on insanity. The number, for those interested, is now 2,621,440 man-hours, assuming the same leap forward we got going from PS2 to PS3 (perhaps it won't be that large?). Dwell on that for a minute. Our immortal Atari 2600 programmer now has to toil away for over 1,200 years. Even our inflated mega-budget team that only a half-dozen studios worldwide can afford (256 staff) will have to spend 5 years to utilize the available quality. Again the same question is tossed out, and the same answers show up. You can cut the size and scope of the game back 50%, but even then a non-mega-budget team of 128 people will still have to spend 5 years. You can move on from mega-budget games to gargantuan-budget games with a staff of over 500 people, but this is REALLY not feasible. Let's say you, as a big-time publisher, decide to cut the game's scope back by half and hire enough people to increase your team size by 50% instead of a full 100%. So you've got nearly 400 people working on a game just to get it done in 2 years. That's $70 million in wages, NOT counting marketing/pressing/overtime.
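All of the figures above follow a single rule: man-hours quadruple each hardware generation from a 640-hour Atari 2600 baseline. The whole projection can be regenerated in a few lines (the generation labels and the 40-hour-week assumption are the post's own):

```python
# Reproduce the thread's man-hour estimates: a 640-hour Atari 2600 game,
# with required man-hours quadrupling every hardware generation.

BASE_HOURS = 640
GENS = ["2600", "NES", "16-bit", "PS1", "PS2", "PS3/360", "next gen"]
HOURS_PER_PERSON_YEAR = 40 * 52  # regular 40-hour weeks, no vacations

for gen_index, name in enumerate(GENS):
    hours = BASE_HOURS * 4 ** gen_index
    solo_years = hours / HOURS_PER_PERSON_YEAR
    print(f"{name:>8}: {hours:>9,} man-hours "
          f"(one person: {solo_years:,.1f} years)")
```

Running it reproduces every number in the rant: 40,960 for PS1, 163,840 for PS2, 655,360 for PS3/360 (about 315 solo years), and 2,621,440 for the projected next generation (over 1,200 solo years).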

I could go on, but I think you guys must be seeing why the biggest innovation with Unreal 4.0 IS allowing for faster production with fewer people involved. We've hit a plateau where development cost is concerned, and passing it, as you can see, simply isn't a very good option. At $70 million invested in a game, the title (at $35 income per copy sold) would require sales of 2 million copies at full price just to break even. Add in the cost of marketing that big a project, as well as probable overtime and bonuses, and making money on such a venture would really require more like 3 million copies sold. This is all assuming the current pricing structure remains roughly the same. For all I know, the price of a game at retail may go down a bit, with a separate DLC card sold alongside; that may make the real cost to the consumer $25 + $40, and adding in the development cost for the DLC features, perhaps you'd get by okay with 2.5 million sold instead.

How many games in the last 10 years have sold even 2.5 million within 3 months of release?

I dunno. Sorry to clog this thread up with ranting; just trying to offer some perspective on why visual fidelity alone isn't what we should be looking at here.
 