Does Unreal Engine 4 support an Intel HD Graphics 3000 card?

I can see the specs, but I want to know whether those are the ‘recommended’ specs or the ‘absolutely required’ specs.

I’m hoping I can use it with my current card; I won’t be doing anything fancy to start with. UDK 3 worked perfectly fine with little to no lag on my
computer. Obviously there would be missing features like DirectX 11 reflections, but I’m fine if I don’t have them. Again, I just need to know
whether my computer can run Unreal Engine 4, so that if it can, I can buy it.

If I’m not mistaken, you NEED a DX11 GPU now, as there is only the one rendering system in UE4.

I strongly recommend an upgrade when you’re able to, though; Intel cards have their own ‘reputation’, so to speak.
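If you want to double-check what your own GPU and driver actually expose, a rough standalone test like the one below should tell you the highest Direct3D feature level available (a ‘DX11 GPU’ in this context means feature level 11_0). It’s plain Win32/Direct3D code, nothing to do with UE4 itself, and the file name and build command are just examples:

```cpp
// Minimal standalone check of the highest Direct3D feature level the default
// GPU exposes. This is NOT UE4 code -- just a small Win32 test program.
// Example build (assumes the Windows SDK): cl check_dx11.cpp d3d11.lib
#include <d3d11.h>
#include <cstdio>

int main()
{
    D3D_FEATURE_LEVEL achieved = D3D_FEATURE_LEVEL_9_1;

    // A null feature-level list makes the runtime try its default set
    // (11_0 down to 9_1). The device/context out-parameters are optional,
    // so we can query the level without keeping a device around.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION,
        nullptr, &achieved, nullptr);

    if (FAILED(hr))
    {
        std::printf("Could not create a hardware Direct3D 11 device at all.\n");
        return 1;
    }

    std::printf("Highest feature level: 0x%X (11_0 is 0xB000)\n",
                static_cast<unsigned>(achieved));
    return (achieved >= D3D_FEATURE_LEVEL_11_0) ? 0 : 1;
}
```

An HD Graphics 3000 is DirectX 10.1-class hardware, so it should top out at 10_1 here.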

Oh wow, can you confirm this somehow? Is it only the early release that needs DX11, with DX9 or DX10 supported later, or will it remain DX11-only forever? This would really stop a lot of people from using Unreal Engine 4. :frowning:

Even if you can launch UE4 with it, you would suffer horribly while working in a level. You could just buy a GTX 400 series card, even second-hand, and be fine with it instead of running into problems with what you have.

Currently I’m using a laptop, so I can’t upgrade my graphics card. I was hoping that since UDK 3 works perfectly for me, upgrading to Unreal Engine 4 wouldn’t be a problem. I am planning on buying a new computer in a couple of months, but I’m really eager to get my hands on Unreal Engine 4. I really hope I can use it with my current graphics card, or else I’ll be waiting a long time. :confused:

I’m 99% sure that DX11 will remain the only supported renderer, simply because a huge proportion of the engine’s selling points are DX11-only features, like the reflection system. Things like GPU particles are also fairly integral now.

While you’re right that it cuts some people’s ease of access, it’s a fairly unrealistic ambition to develop on an under-equipped system, especially if you’re serious about development. Due to the approachability of Epic (which is no bad thing, you guys are awesome), people sometimes forget that they are working with industry-standard tools here, and the best of them at that. I believe the saying is “don’t bring a cocktail stick to a sword fight.” :wink: It’s like trying to play Battlefield 4 on Ultra on an old laptop; there are some things that are simply not possible if the engine is to make any real progress.

Yeah, I understand. I was just hoping I could get by with the basics in Unreal Engine 4 and then get into developing actually fancy levels when I get a new computer. I guess I’m stuck with UDK for now.

Since this is already a topic about non-high-end hardware…

I have a powerful desktop which is going to run UE4, but I also have a laptop that I take around with me, so it would be cool if it runs on that too. It has an i7 4500U (with Intel HD Graphics 4400) and 8 GB of RAM. It does run DX11. Do you think this will work?

I’m running it on an NVIDIA GTX 460, 8 GB of RAM, and a 2.8 GHz quad-core AMD processor, and I can’t seem to break 45 fps; it dips below 30 fps when I go full screen, and this is just on a basic scene. The question wasn’t about it being industry standard, Jamsh; UDK itself was pretty high quality and still is. The issue is that some developers and a lot of consumers may not have the hardware capability to develop for or run UE4 games at the moment. Even if the top developers are capable of developing for it on their machines, that doesn’t mean that people are going to be able to run their games. It’ll take some time, no doubt. I just think a lot of people were excited by the release of the engine and wanted to switch as soon as possible, and it seems like it will be a bit of a long-term process, at least from where I’m sitting.

I’ll be able to tell you soon, since I’m downloading it as I speak. I’m using the same hardware, and I don’t have the luxury of being able to upgrade to a state-of-the-art machine.

To be clear on whether it’s even possible: I’ve just purchased UE4 on my laptop, and as I won’t be able to run it on my desktop until tomorrow, I tried it on the laptop. It’s a no-go, I’m afraid; it crashes immediately when you try to open the Content Examples project. My specs: i5 2.4 GHz dual-core with Intel HD 3000 (it does have a GeForce GT 520M in it which is currently turned off; I haven’t tried it with that).

If you could get UE4 to run on the HD 3000 it would probably be very slow. UE4 is basically meant for “next-gen” when it comes to games, and especially for developing with it. If you can run UDK fine on your laptop, you might want to wait, because I honestly don’t think it would run well, if at all, on the Intel HD 3000 chipset, especially for making content. A finished game using UE4 might, if the devs coded it properly, but for content creation I highly doubt it would run well, if at all.

What are your laptop specs?

Well, I’ve said several times that I’m not planning on making a full-on next-gen game; for now I want Unreal Engine 4 just to learn the basics and play around. I’m not aiming to create anything fancy until I get a new machine.

I know the question wasn’t about the standard. However, if consumers don’t have the ability to develop with UE4 or run UE4 games, then why try to develop on UE4? That’s not a UE4 issue; that’s the consumers’ issue.

My specs: Acer Aspire 5750G, i5 2.4 GHz dual-core with Intel HD 3000 and 6 GB of RAM (it does have a GeForce GT 520M in it which is currently turned off; I haven’t tried it with that).

I never once stated it was a UE4 issue. I simply remarked that it will be a while before indie developers can develop quality games with UE4, and before consumers can buy UE4-powered games, due to the huge hardware gap. Indie devs don’t have the luxury of supercomputers, and neither does your typical PC gamer. This is nothing new; with each new generation of gaming, people have to upgrade their machines. It doesn’t happen overnight, though.

You’re right, but my original post was just to make the point that nobody should expect to develop with UE4 smoothly on older or less capable hardware. It’s just that so far I’ve seen a lot of negativity about the release because of the ‘high’ specifications, which seems pretty backwards to me.


Honestly, they should have minimum specs for people who can’t afford a powerful computer. Allowing it to run in DX9 or DX10 mode would be a good start.

Thing is, though, the Intel HD 3000 onboard graphics isn’t really good by itself; it has issues playing some older games. If you have the dual configuration where it’s an HD 3000 plus an NVIDIA GT card, it should run if you turn the NVIDIA card on (see the sketch below), but onboard chipsets like the Intel HD 3000/4000 aren’t really meant for that kind of work. If I came off rude or anything, I didn’t mean to.

Also, UE4 is still kind of in beta, is it not? The specs might get changed or lowered down the road, who knows. Though when engine developers make engines, especially ones like UE4, they probably aren’t thinking of people trying to develop with the latest tech on onboard chipsets; they’re assuming users have either a dedicated workstation or a mid-to-high-end gaming system.
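For the dual-graphics case specifically: besides setting the preferred GPU per application in the NVIDIA Control Panel, a game you build and ship yourself can hint to the driver that it wants the discrete card. A rough sketch of that export trick (the documented NVIDIA Optimus hint, plus what I believe is the AMD equivalent) looks like this; note it goes in your own executable and won’t change how the editor picks a GPU:

```cpp
// Sketch: exporting these globals from a game's .exe asks switchable-graphics
// drivers (NVIDIA Optimus / AMD PowerXpress) to run it on the discrete GPU
// instead of the integrated Intel chip. This only affects your own builds;
// for the UE4 editor itself, set the preferred GPU in the driver's control
// panel instead.
#include <windows.h>

extern "C"
{
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int   AmdPowerXpressRequestHighPerformance = 1;
}

int main()
{
    // Placeholder entry point; the exports above are the whole trick.
    return 0;
}
```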

It would be, if it were possible, but it just isn’t. It’s down to the way the engine and the new renderer work; it’s a complete rewrite. Besides, we still have UE3 for exactly that reason! I’m personally glad for the single renderer. Multiple renderers made developing in UE3 a bit of a nightmare at times, especially when downscaling a project to DX9, only to find that it couldn’t do a lot of what you were doing in DX11. Mesh-painted reflections were one thing I found I couldn’t work around in DX9.

On a side note, you can pick up a DX11-capable card for less than three digits’ worth of cash now (in UK terms over here), which really isn’t bad at all. The tools are aimed at users with those kinds of computers; there comes a point where it’s unreasonable to keep supporting less capable systems. I see why it’s frustrating for a lot of users, but at some point or another the move has to be made.