I agree with the above two posts. It's much better to fix stuff yourself, and figure out why it broke, than to let someone do it for you; otherwise learning programming becomes pretty ironic. Then again, there's a balance to strike, because you don't want to waste time learning Assembly just to make a Photoshop clone.
That said, Java isn't a very good language to start game programming in. For one, its design philosophy wasn't really built with games in mind, and there's a reason most games aren't coded in Java. Minecraft is a good example. In its case the problem has less to do with the code design than with garbage collection (though I guess they go hand in hand): Minecraft allocates on the order of 100-200 MB/s of memory, then throws away what it doesn't need after a short period, which is very inefficient and can lead to lag spikes depending on how long that period is. The author of OptiFine, an optimization mod for Minecraft, has a write-up about it here. Of course, this is partly the developers' fault, since they could be managing those allocations better within Java. If they were using C++, there'd be no garbage collector to lean on; they'd be forced to manage memory explicitly, and the allocate-and-discard pattern they use now would be hard to get away with.
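To make the allocation-churn point concrete, here's a minimal sketch of the usual Java fix: reuse objects through a pool instead of allocating fresh ones every frame, so far fewer objects become garbage. The `Vec3` class and pool here are purely illustrative, not Minecraft's or OptiFine's actual code.

```java
import java.util.ArrayDeque;

public class PoolDemo {
    static final class Vec3 {
        double x, y, z;
        Vec3 set(double x, double y, double z) {
            this.x = x; this.y = y; this.z = z;
            return this;
        }
    }

    // Naive style: a fresh Vec3 per call; each one becomes garbage
    // immediately, feeding the collector on every frame.
    static double lengthNaive(double x, double y, double z) {
        Vec3 v = new Vec3().set(x, y, z);
        return Math.sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    }

    // Pooled style: borrow a Vec3, use it, hand it back for reuse.
    static final ArrayDeque<Vec3> pool = new ArrayDeque<>();

    static Vec3 borrow() {
        Vec3 v = pool.poll();           // reuse a spare if one exists
        return (v != null) ? v : new Vec3();
    }

    static void release(Vec3 v) {
        pool.push(v);                   // keep it for the next caller
    }

    static double lengthPooled(double x, double y, double z) {
        Vec3 v = borrow().set(x, y, z);
        double len = Math.sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
        release(v);
        return len;
    }

    public static void main(String[] args) {
        System.out.println(lengthNaive(3, 4, 0));   // prints 5.0
        System.out.println(lengthPooled(3, 4, 0));  // prints 5.0
        System.out.println(pool.size());            // prints 1
    }
}
```

In a hot loop the pooled version allocates only until the pool warms up, after which the GC has almost nothing to collect, which is exactly the kind of discipline C++ would have forced from the start.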
I also heard that the next generation of graphics APIs (Vulkan, DX12, Mantle) is trying to deal with this sort of issue in order to get more performance for users. In the past, the drivers did a lot of error checking behind the scenes to catch mistakes developers might have left in, which made things easier at a performance cost. Now the APIs rely on the developers to get it right, so that the end user gets better performance when it actually works. It's hard to say how much of an improvement the more popular APIs like DX12 and glNext/Vulkan will see, but Mantle has already been used in a few titles, with clear improvements over the alternative, DirectX 11, on AMD cards.
All that goes to show that sometimes, giving developers more freedom to run and frolic, at the risk of messing up, is better. One of the most common phrases I hear from great coders and electricians is "fail, and fail often." You don't learn anything by doing everything correctly the first time.
And really, working through the logic is as important as the arithmetic. It's boring, and can even be painful, but if you learn to do it right and do it a lot, it becomes much easier in the long run. In my Calc classes, my teacher made us do all the arithmetic and algebra in the problems we covered, even in the examples, just to make sure we didn't pick up any bad habits along the way. Sure, you could just punch it into a calculator, but by that logic, why program at all? You might as well just click a button that almost does what you want.
Programming is like math. You need to practice to get better and better.
Yeah, it is. A big difference, though, is that if you get something wrong in programming, you have an incentive to fix it so you can get the result you want. With a bad math test, it's easy to put it away and try not to think about it again.