Your move
More seriously, what do you hope to achieve?
4K panels are quite cheap right now; however, for the money you do have some other options.
Games themselves have not yet embraced it in a terribly meaningful way. Most FPS fights are not conducted at the kind of ranges where it gets useful to see 4 pixels instead of 1 (4K for most purposes is four 1080p screens stapled together). RTS is basically dead, and in any case it tends not to let you see more of the map at once. I guess you could have more inventory or menus open in some games, but if that is a problem you could probably have got more screens instead. Splitscreen on a PC? Hahahahaha. Texture wise, like for like... I don't find much in it; far more is gained from simple improvements in rendering technology. A like for like game rendered at 4K is slightly prettier at a distance, if the texture artist played to it. Maybe that will change in the future, but that would be the future, and such a screen will be even cheaper then.
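To put numbers on the "four 1080p screens stapled together" claim, a quick sketch (using the standard consumer UHD and full HD pixel counts):

```python
# Pixel counts for consumer "4K" (UHD) versus 1080p.
uhd = 3840 * 2160   # UHD, what monitor boxes call 4K
fhd = 1920 * 1080   # 1080p / full HD

# A UHD panel carries exactly four 1080p panels' worth of pixels,
# which is why a like-for-like game mainly gets sharper, not wider.
print(uhd // fhd)   # 4
```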
You don't have to take my word for it though. The same people that started out years ago getting widescreen to be a thing in games, by hack or by patch, kept on going into 4K-type arenas after widescreen became the default:
http://www.wsgf.org/mgl
It has uses outside games -- a spreadsheet with many cells visible at once in 4K is wonderful, and with a suitably sized 4K display I might almost be able to give up the second monitor. 4K for any kind of productivity work (I can't imagine doing 3D modelling for real on a 1080p screen again), where before you would have had multiple screens or flicked between windows, is definitely something I would push for. If you like to edit video it also means you can have your 1080p video at full size on the screen at all times with everything else around it, or more usefully handle higher resolution video if you have gone in for that. On the matter of 4K video... a mixed bag for me. Even on the shift to Blu-ray I would start to spot where sets and costumes were less than stellar, where makeup artists could only hide so much, and 4K does not make that better.
The other options are ultra wide, high refresh and I guess technically 3d.
4K is still a 16:9 aspect ratio. You can get monitors far wider than that (see the FOV material on the site above, as it plays into this) and those can do nice things for games... I hate to use "immersive" at this point, as it is a ruined word, but it really can be, and it is far easier to hack a meaningful ultrawide mode into a game than to turn it into a 4K one. Separately, you can also get a tiny bit more vertical resolution (see 16:10), which can be nice for general day to day computing.
High refresh. According to some I am a luddite when it comes to refresh rate (I shall spare us the related discussion of proper motion blur as well). I have been there, done it many years ago, and found not a lot in it outside of CRT displays. The only thing I find it good for is this: a dip from 30 fps to 15 is going to be noticed, while a 15 fps dip from far higher is not. Still, you appear to have a monster rig capable of cranking out frames in the hundreds of FPS range even when doing all the fancy tricks. For a well designed game that matters little (some older ones couple framerate to input polling, which can have some effect). Either way, if you only have a 60 Hz monitor (which is most things since leaving CRT, except some older 4K stuff*) then it is only ever going to display 60 frames per second. 120 Hz and 144 Hz screens are available, and some enjoy those a lot. There are also screens which try to leave fixed rates behind with variable refresh (sometimes going under the trade names of AMD's FreeSync or Nvidia's G-Sync, the former being the more open one and thus slightly more popular). There are some screens which will try to double frame rates and "smooth" motion out... not a fan myself. You can improve the motion of video very well in software ( http://avisynth.org.ru/mvtools/mvtools2.html ) but it is not going to happen in real time for any kind of price a mortal can afford.
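The frame-time arithmetic behind that dip claim is worth seeing once: equal-sized fps drops are wildly unequal in milliseconds (the 144 to 129 pairing here is my own illustrative pick, not from any benchmark):

```python
# Frame-time deltas: why a drop from 30 to 15 fps is jarring while a
# similar-sized drop from a high frame rate barely registers.
def frame_ms(fps):
    return 1000 / fps  # milliseconds each frame stays on screen

for hi, lo in ((30, 15), (144, 129)):
    delta = frame_ms(lo) - frame_ms(hi)
    print(f"{hi} -> {lo} fps: each frame lingers {delta:.1f} ms longer")
```

The 30 to 15 drop adds over 33 ms per frame; the high-rate drop adds under a millisecond, which is why nobody notices it.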
*The older HDMI 1.4 standard only did 4K at 30 Hz. HDMI 2.0 does higher, DisplayPort never had a problem in anything you are likely to find out there today, and DVI tops out below 4K at 60 Hz anyway so it does not really enter into it. A lot of the older, cheaper 4K displays only did 4K at those lower refresh rates and might have only had HDMI 1.4 inputs.
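The HDMI 1.4 limit falls straight out of pixel-clock arithmetic. A rough sketch, using the standard 3840x2160 timing totals (active pixels plus blanking); the blanking figures are the usual published ones, not something I measured:

```python
# Why HDMI 1.4 stopped at 4K30: the pixel clock for 4K60 overshoots
# its 340 MHz TMDS ceiling (HDMI 2.0 raised that to 600 MHz).
h_total, v_total = 4400, 2250   # 3840x2160 active area + blanking

for hz in (30, 60):
    clock_mhz = h_total * v_total * hz / 1e6
    print(f"4K @ {hz} Hz needs a ~{clock_mhz:.0f} MHz pixel clock")

# 30 Hz lands at 297 MHz (fits under 340); 60 Hz needs 594 MHz (does not).
```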
3D. Has thankfully died again for the time being -- nobody is really making 3D films or TV right now. It is nevertheless possible to get things which display in 3D (almost invariably with glasses, quite often proprietary active shutter ones which need power and cost a lot). Some games did nice things with it, and you have the films (most of which were converted in post production and thus look awful). It is an option, though not one I am inclined to pick, mind you.
You also have other concerns if you want them.
Latency/lag. The time taken to get a signal out of the card and onto the display. Originally with CRT this was about as fast as... electricity, really: any change on the monitor was done by varying voltages and magnetic fields. As computers got involved to tweak the image for various types of LCD, this time started to creep up -- doing that processing truly in real time is hideously expensive, while allowing it a little delay is far cheaper. Nobody cares when watching a DVD if technically there was a fraction of a second between hitting pause and the frame popping out of the TV; it is potentially rather more troubling when you are trying to click on someone's head. If you have ever seen a TV with a computer/game mode, this is what it is for. Not all 4K displays are made equal here. Screen lag/latency is measured in milliseconds, though there are some misleading terms some like to go with.
Getting a low latency 4K display might be harder (or more expensive) than doing the same for a lower resolution (even ultra wide). Note that you can have a lot of latency with a high refresh rate screen, most try not to but there are some out there with such drawbacks.
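To make those millisecond figures concrete, compare a quoted lag number to the frame interval at your refresh rate (the 40 ms lag here is an illustrative number, not a measurement of any real panel):

```python
# Putting a display-lag figure into context: how many whole frames
# it eats at a given refresh rate. lag_ms is a made-up example value.
refresh_hz = 60
lag_ms = 40

frame_ms = 1000 / refresh_hz        # ~16.7 ms per frame at 60 Hz
frames_behind = lag_ms / frame_ms
print(f"{lag_ms} ms of lag = ~{frames_behind:.1f} frames at {refresh_hz} Hz")
```

A screen sitting on two or more whole frames of processing is exactly the sort of thing a TV's game mode exists to switch off.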
Colour reproduction. The colours on your display might not exactly match certain reference standards. If you are into high end video and image editing, printing and whatever else, some really like having the screen match the print exactly, or match what someone else sees when you send it over, and it can be nice to have. Bear in mind it can also go the other way; I once designed a nice website for someone using a lovely brown (my monitors have good colours). I later saw it on their screen (not so well calibrated) and... shall we say it was a less lovely shade of brown, and rather more reminiscent of things you don't want your food website to remind people of. Audio engineers of the past recognised something similar -- your nice studio monitor speakers that change nothing of the output are great and all, but you mainly want the track to sound great coming out of the nastiest job site radio speaker.
HDR. High dynamic range. It means something slightly different in screens than in photography, but it is a similar idea -- your blacks will be black at the same time as your whites can burn your eyeballs out. Speaking of which, plain brightness/contrast can be a thing for some.
Angle of viewing. Not so bad if the monitor is just for you and you can adjust accordingly, but if you want people to look over your shoulder, then a few degrees off-axis they may well find themselves looking at far different colours, or even a nice bit of grey.