So slow I'm ashamed...

When on a desktop PC I get by with my old 2010 2.93GHz i7 iMac. I'm not a gamer, so I usually get at least 8 years or so out of my computers. I'm about as platform-agnostic as anyone can get these days. I was a rabid Apple user in the late '80s and the dark '90s, when many thought Apple wouldn't survive before Jobs' return.
 
I have a projector, too. I don't game on it much at all. I did for a few weeks after I got it, but it's mostly for watching movies now.

I do too. 106" (diagonal) of HD glory. It's mostly for movies these days.
 
I ditched my custom desktop 15 years ago and have been on laptops ever since for portability. My daughter and I recently upgraded to a Dell 7567 (i5, GTX 1050 Ti) and we play Overwatch almost 4-5 hours/day.

For all of the desktops posted, they're way overkill for web surfing (if that isn't obvious). Being the only technical support in a big family, I've found most of the slowness on a decent machine comes from bloatware, so make sure you remove unneeded software and disable unneeded services. I was a Windows user from the beginning, but I've been an exclusive Ubuntu user for 11 years. My old Dell E7440 with a slow i5-4200U, 16GB RAM, and a 256GB SSD was absolutely fast with all the dev tools running on it (Nginx, MySQL, Apache Solr, Memcache, Eclipse, ...). Also, for any web sites with ads that tax your computer (animated, video, a bunch of them on one page), punish them with AdBlock and blacklist them with YesScript.
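If you want to see where the memory is actually going before you start uninstalling things, here's a minimal sketch using Python with the psutil package (my assumption, not something anyone here mentioned; pip install psutil) that just lists the biggest resident-memory hogs:

[CODE]
# Rough bloatware hunt: list the top resident-memory consumers.
# Works the same on Windows and Ubuntu, as long as psutil is installed.
import psutil

procs = []
for p in psutil.process_iter(['name', 'memory_info']):
    mem = p.info['memory_info']
    if mem is not None:                          # skip processes we can't read
        procs.append((mem.rss, p.info['name'] or '?'))

for rss, name in sorted(procs, key=lambda t: t[0], reverse=True)[:15]:
    print(f"{rss / 1024**2:8.1f} MB  {name}")
[/CODE]

Anything near the top that you don't recognize and never asked for is a candidate for removal, or at least for disabling at startup.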

Projecting on a big wall can absolutely re-create that movie theater feeling, but it lacks contrast. I now prefer a local-dimming TV (mine is a 65" Vizio); a campfire scene looks gorgeous with everything completely dark except for the fire. That's also the reason most people now prefer IPS over TN panels for external monitors (though for gaming, IPS panels that are fast enough are often expensive).
 

My projection system uses a high-contrast screen, so lack of contrast isn't an issue.
 
Mine's not HD. Another reason I don't game on it.

 
Wow, it's nice seeing all the PC enthusiasts here! I'm running an Alienware 15 R3 (6700HQ, 120Hz panel, 1070, 24GB RAM, 1TB SSD).
 
Yeah, after I turned a 40" TV into a monitor in like 2008, I haven't used a small monitor since. A 47" 120Hz (480Hz emulated) 1080p set is what I'm using now.
 
I upgrade mine in minor phases for the board components and only do a chipset upgrade every 4-5 years or so. I have:

- 6700K @ 4.7GHz (H60 cooler)
- MSI Z170A SLI Plus mobo
- MSI GTX 970
- 16GB DDR4 2400MHz, 15-15-15-35 timings
- Samsung 850 EVO 250GB
- 2x WD Caviar Black 1TB drives in RAID 0
- Corsair 700W modular PSU (my oldest part)
- Fractal Design Define R5 windowless case


Generally, gaming computers only really need a decent video card, 4GB total RAM, and a mediocre CPU with good single-thread performance (this is why Ryzens suck for most games). More specifically, any cross console/PC port AAA title will not require much. I play a lot of Arma 3, which is the other way around: it is much more CPU- and RAM-intensive than GPU-intensive because it makes thousands of ballistic calculations and AI decisions in real time, so if you want to give your gaming rig a run for its money I would recommend trying that game (although it is not for everyone). The GTX 970 can handle any current game on ultra settings just fine (at 1080p), but I think I'll get a 1080 this year since its price plummeted after the Ti release. Although I will sooner swap over to 3000MHz RAM; initially I just went with 2400 since in about 99% of games anything past 2400MHz yields diminishing returns, but memory speed in Arma (and I imagine a lot of simulators) is crucial to performance.

Monitors are pretty important, too. 4K doesn't quite seem worth it yet, so I'm going to upgrade my 5+ year old monitors to a nice G-Sync monitor.

Computers, like any material hobby, are a benchmark comparison. People like to buy the best possible stuff just so they have the latest and greatest, and will likely only use 10-20% of its potential until it is deemed obsolete by the next generational release. I know of someone who had two 1080s and is now getting rid of them just so he can have two 1080 Tis. I'm pretty sure he just plays Overwatch, which is kind of a joke in the visual fidelity department. That's kind of like buying a .338 Lapua rifle, putting a dot sight on it, and shooting at room-distance targets. That's cool, I guess, if you have that much money to burn; I'd rather spend it on stuff like guns or bourbon, or just not spend it.
 
I bought what I bought so I could play current games like Witcher 3, Overwatch, etc., with great framerates, and when the next generation of games comes out in a year or three that leans harder on the CPU, I can play those, too.

A friend of mine has an i7 4xxx, a GTX 970, and 8GB of DDR3 RAM.

He cranked up WoW last night at 1440p and maxed everything out. FPS in Orgrimmar was around 60 on average, and memory showed 80%+ utilization.

My PC used to do that about 5 years ago. Then 4 years ago it barely hit 50, and 2 years ago it became literally unplayable even at the lowest settings.

My goal was to future-proof my PC as much as possible, with the exception of knowing I'll need a new GPU in 2-4 years.

My i7-7700K can overclock to a very stable 6+GHz all day long with liquid cooling like I have. 32GB of DDR4 should last a while (though I find RAM sticks go bad from time to time over the years). Anyway, the only thing I foresee issues with in the next 5 years is the GPU. Maybe.
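If anyone wants to sanity-check what their chip actually holds under load (peak overclocks and sustained clocks aren't always the same thing), here's a minimal sketch, again assuming Python with psutil installed:

[CODE]
# Print current per-core clock speeds every couple of seconds.
# Run a stress test or a game in another window and watch what the CPU sustains.
import time
import psutil

for _ in range(10):
    freqs = psutil.cpu_freq(percpu=True) or []
    if freqs:
        print("per-core MHz:", " ".join(f"{f.current:.0f}" for f in freqs))
    time.sleep(2)
[/CODE]

On some platforms psutil only reports a single package-level frequency, so treat it as a rough check rather than gospel.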

That is why I spent the $1,750 including tax and shipping... because it was cheaper than building a "minimum for Witcher 3 today" PC, which would have initially cost me around $1k even cannibalizing my old PC, with virtually no future-proofing, and would likely need an upgrade in less than 2 years.

I had my last PC built in 2010, I believe. I just overclocked it to 3.6GHz 2 days ago. It's still foundering. It's at the bare edge of usability for net surfing and struggles with StarCraft 2 on lower-end settings. That is 7 years from a mid-to-upper-level 2010 build that I pinched a few pennies on. I am really hoping for 8 years (technology increases faster than linearly) out of this PC, which I consider a near-top-end build (within reason, for common gaming use). It just makes sense financially to do it this way rather than shooting for the minimum spec for today.

Just like your rifle analogy, I bought a Daniel Defense carbine for hunting and home defense. Now I run a suppressor full time and pump thousands of rounds suppressed downrange every year with it. I bought smart at first, and now that I'm using it harder, it's not disappointing. It still shoots around 1" at 100 yards. Next I need a .308, so I have an SR-25 ACC on order. It will kill deer... or make me competitive in PRS gas gun competition if I so choose.
 
Man, you guys have some hefty gaming machines pushing legit HP.

My next system is most likely an old school 16-bit Super Nintendo...or Sega Genesis with "blast processing".
 

I need a legit computer, because I keep them for 5+ years and expect them to perform well past that. It saves money and hassle in the long run, so on day 1 my computers are always seriously overgunned.
 
I agree with everything Kevin said, except:

Generally, gaming computers only really need a decent video card, 4GB total RAM, and a mediocre CPU with good single thread performance

4GB is not enough for gaming. 8GB is the minimum, 16GB is the sweet spot. Anything more is simply overkill and not needed.

I was trying to prove this to a buddy who was asking me if I was sure 16GB was enough. I ran Battlefield 1 on one screen and streamed a baseball game on SlingTV on the second monitor. Total RAM usage: just shy of 8GB. That's strictly gaming; if you're doing 3D work or rendering video, you might need more.
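If anyone wants to repeat the experiment, here's a quick sketch (same assumption as above: Python with psutil installed) that logs system-wide RAM usage while you play and remembers the peak:

[CODE]
# Log total system RAM usage every 5 seconds; Ctrl+C to stop and print the peak.
import time
import psutil

peak_gb = 0.0
try:
    while True:
        vm = psutil.virtual_memory()
        used_gb = vm.used / 1024**3
        peak_gb = max(peak_gb, used_gb)
        print(f"used: {used_gb:5.2f} GB ({vm.percent:.0f}%)   peak: {peak_gb:5.2f} GB")
        time.sleep(5)
except KeyboardInterrupt:
    print(f"Peak RAM usage this session: {peak_gb:.2f} GB")
[/CODE]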

http://www.pcgamer.com/how-much-ram-do-you-really-need-for-gaming/

I started on PC. After PC. After PC. Got tired of chasing the upgrade dragon and switched to PS3 for a few years. After that, got a gaming laptop, which I still have today. That was great for a few years. A friend and I each bought one and we would go over to a third friend's house and play BF4 together. Now I just built this bad boy a few months ago.
 

I used to hate console gaming. Thought it was stupid. PC is so much more useful. Then a friend bought me an Xbox One for Christmas and I was obligated to play it with him. Changed my whole outlook. I still love my WoW and the internet, but consoles rock, too. As for chasing the dragon: I don't. I slay it, and then feed off of it for 5-7 years, rofl!
 
I stopped upgrading my PC all the damn time once I put in custom CPU + GPU water cooling. Got top-of-the-line (for what I wanted) at the time and figured it would be good enough for several years. I'm still maxing out everything I play. I could upgrade more easily now since I put in a drain port on my last teardown/maintenance, but eh, the tubing runs work how they are now and I'm too lazy to do it again.

As for consoles, I haven't really played them since the PS2. I grew up with the PS2, so games like Jak and Daxter, Ace Combat, and the ol' James Bond shooters were my games. But I also played computer games, and EverQuest was my addiction from like 1999 to 2003.
 
Hell, I'm still running Windows Vista... and I get a blue screen every time I fire it up... and it thinks it's 2007, which causes a whole host of problems.
 
That benchmark was published on 3/14, and the patch just came out this week. Let's wait for the new benchmark when other games implement the patch.

I'm not saying Ryzen isn't G2G, I'm just saying that after my Phenom II, I am not an AMD guy anymore. I never really was, but it was cheap, and I was broke at the time. Ryzen seems to be the first time I can remember, though, that AMD and Intel have actually gone head to head and no one chuckled. Ryzen is legit, I freely grant.

That said, since Ryzen has slower native clock speeds, and I don't do 3D graphic design stuff or whatnot, I still think the i7 is going to dominate, for ME. Even if it doesn't, I don't think anyone can argue that I'm CPU-limited rather than GPU-limited.

I did some research, and the 1080 by itself certainly could be improved upon, even for 2560x1440 gaming. I kind of wish I had gone 1080 Ti or 1080 Ti SLI, but that's $$$ and the 1080 is decent. I won't need to upgrade for at least a year, but I also won't get near the 5 years out of it I'd hoped. A friend of mine is getting only 47-67 FPS out of his GTX 1080 in the Ghost Recon: Wildlands benchmark, and the rest of his PC is not bottlenecking it, either.
 