
My system good for 1070 ti?


  • My system good for 1070 ti?

I have an i5-4670, water-cooled and OC'd to 4 GHz, in an MSI Z71-GD65
16 GB DDR3 RAM

Want to upgrade to a 1070 Ti. Will my system be a bottleneck?
    Last edited by GhostTX; 11-03-2017, 11:33 PM.
    "Self-government won't work without self-discipline." - Paul Harvey

  • #2
    no



    • #3
      Originally posted by GhostTX View Post
      Will my system be a bottleneck?
Definitely not. You wouldn't bottleneck even an i5-3570K at 1080p.
      WH



      • #4
I have an i5-4670K, 16 GB, and an RX 580, which is about 15% slower than a 1070, and the system doesn't bottleneck the card in the slightest.



        • #5
          Nice. Thanks.

Wanting more oomph for FPS gaming. My three-year-old GTX 760 is getting a bit long in the tooth.
          "Self-government won't work without self-discipline." - Paul Harvey



          • #6
Not that it really matters with the CPU you have, but if you start to have concerns in the next year or two, shift up to 1440p if you can. I know most people still play at 1080p for now, but if you can do 1440p you'd take nearly all the work off the CPU and put it on the card. I know a guy who is playing the latest Forza just fine with a lowly AMD FX-6300. That's like a five-year-old i3. But he's got a GTX 970 and he plays at 1440p on higher settings, and it's silky smooth according to him.
            WH
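To put rough numbers on the "move the work to the card" idea, here is a quick back-of-the-envelope sketch (standard 16:9 resolutions; illustrative only, not a benchmark). Per-frame pixel work scales with resolution, while CPU work like game logic and draw calls is mostly resolution-independent:

```python
# Per-frame GPU pixel work at common 16:9 resolutions, relative to 1080p.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = RESOLUTIONS["1080p"][0] * RESOLUTIONS["1080p"][1]  # 2,073,600 pixels

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the pixel work of 1080p)")
```

At 1440p the GPU shades about 1.78x the pixels of 1080p for the same CPU-side work per frame, which is why the balance shifts toward the card.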



            • #7
Maybe a new monitor later, then? My current one tops out at 1080p.
              "Self-government won't work without self-discipline." - Paul Harvey



              • #8
I don't like telling people they should go out and buy the latest and greatest, but 1440p isn't even the latest and greatest; that would be 4K. Still, 1080p is on its way out, and it leans on the CPU more than 1440p does. That card will do 1440p very well, so a new 1440p monitor would have the effect of extending your CPU's usable life even further. For gaming, anyway.

I'd recommend a Samsung with a gloss finish, NOT matte. Matte is pretty much a bad choice on higher-resolution monitors. Just make sure there isn't glare on it from windows in the room. If you're coming from a matte monitor, you'll notice the contrast is better with gloss and the colors pop more.
                WH



                • #9
You should buy a 1070 Ti and send it to me... In return, I'll send you my 1050 Ti... it's a fair trade, I promise.



                  • #10
OK, this has been on my mind for a while, because it's not something I'm used to. The next processor usually blew the old ones away; it was typically at least well worth the upgrade if you had the need to upgrade. The rate of progression has slowed so much that my five-year-old i5-3570K is only just now seeing something worth replacing it: the i7-8700K. I think Moore's law is probably the answer. From the Wikipedia article:


"Moore's law is an observation or projection and not a physical or natural law. Although the rate held steady from 1975 until around 2012, the rate was faster during the first decade. In general, it is not logically sound to extrapolate from the historical growth rate into the indefinite future. For example, the 2010 update to the International Technology Roadmap for Semiconductors predicted that growth would slow around 2013, and in 2015 Gordon Moore foresaw that the rate of progress would reach saturation: "I see Moore's law dying here in the next decade or so."

                    Intel stated in 2015 that the pace of advancement has slowed, starting at the 22 nm feature width around 2012, and continuing at 14 nm. Brian Krzanich, CEO of Intel, announced that "our cadence today is closer to two and a half years than two". This is scheduled to hold through the 10 nm width in late 2017. He cited Moore's 1975 revision as a precedent for the current deceleration, which results from technical challenges and is "a natural part of the history of Moore's law".

                    Krzanich and others in the industry expect Moore's law to continue indefinitely, "As we progress from 14 nanometer technology to 10 nanometer and plan for 7 nanometer and 5 nanometer and even beyond, our plans are proof that Moore’s Law is alive and well." However, other observers expect the geometrical reduction in scaling, the traditional formulation of Moore's law, may end by around 2025."

As you can see, they talk about it starting to slow down before 2025 is reached. We're in that period now. I guess after 2025 it's time for the quantum computers to take over.
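A quick sketch of the cadence math behind that Krzanich quote (illustrative only: it assumes a clean doubling of transistor count once per cadence period, which real product lines don't follow exactly). Over a fixed span of years, the growth factor is 2 raised to (years / cadence):

```python
# Effect of the cadence slipping from 2.0 to 2.5 years per doubling.
def growth_factor(years, cadence_years):
    """Multiplicative transistor-count growth over `years` at one doubling per cadence."""
    return 2 ** (years / cadence_years)

decade = 10
print(f"2.0-year cadence over {decade} years: {growth_factor(decade, 2.0):.0f}x")
print(f"2.5-year cadence over {decade} years: {growth_factor(decade, 2.5):.0f}x")
```

Stretching the doubling period by just half a year cuts a decade's growth from roughly 32x to roughly 16x, which is why the slowdown is noticeable even though chips are still improving every generation.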
                    WH



                    • #11
The rate of progression has not slowed at all. CPUs are being replaced with faster and far more efficient processors. In the next 5-10 years you will start to see a shift from kitchen-sink CPUs to form-fitting ASIC chips, or FPGAs. In essence, you should start to see more co-processors. I don't believe in Moore's law, but it's still progressing, just in a different medium.

As well, ARM chips will become more prevalent soon, and the desktop computer will no longer be an expensive platform. You'll be able to build a really cheap computer and spend the extra money on the heavy-lifting parts, which is as it should be.



                      • #12
                        Originally posted by abecx View Post
I don't believe in Moore's law, but it's still progressing, just in a different medium.
Intel talks a bit about how they tended to follow Moore's law: https://www.intel.com/content/www/us...moore-law.html
                        WH



                        • #13
                          Originally posted by abecx View Post
I don't believe in Moore's law, but it's still progressing, just in a different medium.
                          Neither do I. It's a bullshit prediction/theory that the industry arbitrarily decided to adhere to in order to make measurable progress with each new generation.



                          • #14
I'm not really clear on it either...

So Moore was saying that by 2025... what? There would be some kind of halt to progress? I'm curious to know what will happen when they finally get to 1-nanometer chips. Maybe they'll forever just be 1 nanometer, even ten years later in 2035.
                            WH
